Opacity is a term healthcare professionals use when describing findings on chest imaging, such as a lung CT scan.
It is a radiological term for the hazy gray areas that appear on images made by CT scans or X-rays.
Typically, healthy lungs appear black on a CT scan or X-ray because they are filled with air. When gray areas are visible instead,
it means that something is partially filling that space inside the lungs.
These gray areas are referred to as ground-glass opacity. Ground-glass opacity can be a sign of:
- fluid, pus, or cells filling the air space
- walls of the alveoli thickening
- interstitial spaces within the lungs thickening
Opacities in the lungs can be caused by a variety of both acute and chronic concerns. Some potential reasons for lung opacity
include:
- pneumonia
- COVID-19
- pneumonitis
- EVALI
- interstitial lung disease
- pulmonary edema
- alveolar hemorrhage
- lung cancer
Pneumonia is an infection in one or both lungs caused by bacteria, viruses, or fungi. The infection leads to inflammation in the air sacs
of the lungs, which are called alveoli. The alveoli fill with fluid or pus, making it difficult to breathe.
An X-ray helps your doctor look for signs of inflammation in your chest. If inflammation is present, the X-ray can also inform your doctor
about its location and extent.
CT scans provide a clearer and more detailed picture of your lungs.
Medical images are stored in a special format called DICOM (*.dcm files). Each file contains header metadata together with the
underlying raw pixel arrays for the image data.
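Before reaching for a DICOM library, the file layout itself can be checked with the standard library: a DICOM file begins with a 128-byte preamble followed by the four-byte magic `DICM`, after which the header elements and pixel data follow. A minimal sketch (the synthetic file written below is illustrative only, not a complete DICOM dataset):

```python
import os
import tempfile

def looks_like_dicom(path):
    """Check the DICOM magic: a 128-byte preamble followed by b'DICM'."""
    with open(path, "rb") as f:
        preamble = f.read(128)
        magic = f.read(4)
    return len(preamble) == 128 and magic == b"DICM"

# Synthetic demo file: zero preamble + magic (not a valid full DICOM dataset)
with tempfile.NamedTemporaryFile(suffix=".dcm", delete=False) as f:
    f.write(b"\x00" * 128 + b"DICM")
    demo_path = f.name

ok = looks_like_dicom(demo_path)
os.remove(demo_path)
```

In practice the notebook uses `pydicom` to read these files, which performs the same validation internally.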
import platform
import socket
import psutil
print("\n\nAVAILABLE SYSTEM RESOURCES FOR CAPSTONE PROJECT:\n")
print("***************************************************************************\n\n")
print("PLATFORM - ",platform.system())
print("")
print("PLATFORM - RELEASE - ",platform.release())
print("")
print("PLATFORM - VERSION - ",platform.version())
print("")
print("ARCHITECTURE - ",platform.machine())
print("")
print("HOST NAME - ",socket.gethostname())
print("")
print("PROCESSOR - ",platform.processor())
print("")
print("RAM - ",str(round(psutil.virtual_memory().total / (1024.0 **3)))+" GB")
print("\n\n***************************************************************************\n\n")
AVAILABLE SYSTEM RESOURCES FOR CAPSTONE PROJECT:

***************************************************************************

PLATFORM -  Windows

PLATFORM - RELEASE -  10

PLATFORM - VERSION -  10.0.22621

ARCHITECTURE -  AMD64

HOST NAME -  MSI

PROCESSOR -  Intel64 Family 6 Model 141 Stepping 1, GenuineIntel

RAM -  8 GB

***************************************************************************
# Import all the necessary libraries
import numpy as np
import pandas as pd
%matplotlib inline
import matplotlib.pyplot as plt
import matplotlib.patches as patches
import matplotlib.gridspec as gridspec
import seaborn as sns
from PIL import Image
from IPython.display import display
import os
import shutil
from pathlib import Path
import pathlib
from zipfile import ZipFile
from glob import glob
import six.moves.urllib as urllib
import sys
import tarfile
from collections import defaultdict
from io import StringIO
import cv2
import pydicom as dicom
from pydicom.pixel_data_handlers import convert_color_space
from pydicom.pixel_data_handlers.util import apply_color_lut
from fastai.basics import *
from fastai.callback.all import *
from fastai.vision.all import *
from fastai.medical.imaging import *
from tqdm import tqdm
import pickle as pkl
from sklearn.preprocessing import LabelEncoder, OneHotEncoder
from sklearn.model_selection import train_test_split
from sklearn import metrics
from sklearn.metrics import roc_auc_score
from sklearn.metrics import confusion_matrix
# Note: sklearn.metrics.plot_confusion_matrix was removed in scikit-learn 1.2;
# use ConfusionMatrixDisplay.from_estimator / .from_predictions instead
from sklearn.metrics import ConfusionMatrixDisplay
from sklearn.metrics import classification_report
from sklearn.metrics import accuracy_score, precision_score, recall_score, f1_score
import tensorflow as tf
from tensorflow.keras.utils import to_categorical
from tensorflow.keras.models import Sequential # initial NN
from tensorflow.keras.layers import Activation, Dense, Dropout, BatchNormalization # construct each layer
from tensorflow.keras.layers import Conv2D, SeparableConv2D
from tensorflow.keras.layers import MaxPooling2D # swipe across by pool size
from tensorflow.keras.layers import Flatten, GlobalAveragePooling2D, GlobalMaxPooling2D
from tensorflow.keras.optimizers import Adam, SGD, RMSprop
from tensorflow.keras.preprocessing import image
from tensorflow.keras.metrics import Precision, Recall, SparseCategoricalAccuracy
# Note: each applications module exports its own `preprocess_input`, so only the
# last one imported stays bound -- alias them if more than one is needed
from tensorflow.keras.applications.vgg16 import preprocess_input, decode_predictions
from tensorflow.keras.applications.inception_v3 import preprocess_input
from tensorflow.keras.applications import MobileNet
from tensorflow.keras.applications.mobilenet_v2 import MobileNetV2
from tensorflow.keras.applications.densenet import DenseNet169
from tensorflow.keras.applications.densenet import DenseNet121
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Conv2D, Reshape
# Use tensorflow.keras consistently (mixing the standalone `keras` package with
# `tensorflow.keras` can produce incompatible classes); `keras.utils.vis_utils.plot_model`
# now lives at `tensorflow.keras.utils.plot_model`
from tensorflow.keras.callbacks import EarlyStopping, ModelCheckpoint, ReduceLROnPlateau
from tensorflow.keras.utils import plot_model
from tensorflow.keras.models import model_from_json
from tensorflow.keras.preprocessing.image import ImageDataGenerator
from tensorflow.keras.applications.imagenet_utils import preprocess_input
from tensorflow.keras.applications.vgg16 import VGG16
from tensorflow.keras.layers import Input
from object_detection.utils import ops as utils_ops
from object_detection.utils import label_map_util
from object_detection.utils import visualization_utils as vis_util
import warnings
warnings.filterwarnings('ignore')
* Loading the Label and Class info and merging them together
train_label = pd.read_csv("stage_2_train_labels.csv")
train_class = pd.read_csv("stage_2_detailed_class_info.csv")
train_label.head(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | patientId | x | y | width | height | Target |
|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | nan | nan | nan | nan | 0 |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | nan | nan | nan | nan | 0 |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | nan | nan | nan | nan | 0 |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | nan | nan | nan | nan | 0 |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.000000 | 152.000000 | 213.000000 | 379.000000 | 1 |
| 5 | 00436515-870c-4b36-a041-de91049b9ab4 | 562.000000 | 152.000000 | 256.000000 | 453.000000 | 1 |
| 6 | 00569f44-917d-4c86-a842-81832af98c30 | nan | nan | nan | nan | 0 |
| 7 | 006cec2e-6ce2-4549-bffa-eadfcd1e9970 | nan | nan | nan | nan | 0 |
| 8 | 00704310-78a8-4b38-8475-49f4573b2dbb | 323.000000 | 577.000000 | 160.000000 | 104.000000 | 1 |
| 9 | 00704310-78a8-4b38-8475-49f4573b2dbb | 695.000000 | 575.000000 | 162.000000 | 137.000000 | 1 |
| 10 | 008c19e8-a820-403a-930a-bc74a4053664 | nan | nan | nan | nan | 0 |
| 11 | 009482dc-3db5-48d4-8580-5c89c4f01334 | nan | nan | nan | nan | 0 |
| 12 | 009eb222-eabc-4150-8121-d5a6d06b8ebf | nan | nan | nan | nan | 0 |
| 13 | 00a85be6-6eb0-421d-8acf-ff2dc0007e8a | nan | nan | nan | nan | 0 |
| 14 | 00aecb01-a116-45a2-956c-08d2fa55433f | 288.000000 | 322.000000 | 94.000000 | 135.000000 | 1 |
| 15 | 00aecb01-a116-45a2-956c-08d2fa55433f | 547.000000 | 299.000000 | 119.000000 | 165.000000 | 1 |
| 16 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | 306.000000 | 544.000000 | 168.000000 | 244.000000 | 1 |
| 17 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | 650.000000 | 511.000000 | 206.000000 | 284.000000 | 1 |
| 18 | 00d7c36e-3cdf-4df6-ac03-6c30cdc8e85b | nan | nan | nan | nan | 0 |
| 19 | 00f08de1-517e-4652-a04f-d1dc9ee48593 | 181.000000 | 184.000000 | 206.000000 | 506.000000 | 1 |
train_label.duplicated().sum()
0
train_class.head(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | patientId | class |
|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | No Lung Opacity / Not Normal |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | No Lung Opacity / Not Normal |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | No Lung Opacity / Not Normal |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | Normal |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity |
| 5 | 00436515-870c-4b36-a041-de91049b9ab4 | Lung Opacity |
| 6 | 00569f44-917d-4c86-a842-81832af98c30 | No Lung Opacity / Not Normal |
| 7 | 006cec2e-6ce2-4549-bffa-eadfcd1e9970 | No Lung Opacity / Not Normal |
| 8 | 00704310-78a8-4b38-8475-49f4573b2dbb | Lung Opacity |
| 9 | 00704310-78a8-4b38-8475-49f4573b2dbb | Lung Opacity |
| 10 | 008c19e8-a820-403a-930a-bc74a4053664 | No Lung Opacity / Not Normal |
| 11 | 009482dc-3db5-48d4-8580-5c89c4f01334 | Normal |
| 12 | 009eb222-eabc-4150-8121-d5a6d06b8ebf | Normal |
| 13 | 00a85be6-6eb0-421d-8acf-ff2dc0007e8a | Normal |
| 14 | 00aecb01-a116-45a2-956c-08d2fa55433f | Lung Opacity |
| 15 | 00aecb01-a116-45a2-956c-08d2fa55433f | Lung Opacity |
| 16 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | Lung Opacity |
| 17 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | Lung Opacity |
| 18 | 00d7c36e-3cdf-4df6-ac03-6c30cdc8e85b | No Lung Opacity / Not Normal |
| 19 | 00f08de1-517e-4652-a04f-d1dc9ee48593 | Lung Opacity |
train_final = pd.merge(train_label, train_class, on ='patientId')
train_final.head(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | patientId | x | y | width | height | Target | class |
|---|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | nan | nan | nan | nan | 0 | Normal |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.000000 | 152.000000 | 213.000000 | 379.000000 | 1 | Lung Opacity |
| 5 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.000000 | 152.000000 | 213.000000 | 379.000000 | 1 | Lung Opacity |
| 6 | 00436515-870c-4b36-a041-de91049b9ab4 | 562.000000 | 152.000000 | 256.000000 | 453.000000 | 1 | Lung Opacity |
| 7 | 00436515-870c-4b36-a041-de91049b9ab4 | 562.000000 | 152.000000 | 256.000000 | 453.000000 | 1 | Lung Opacity |
| 8 | 00569f44-917d-4c86-a842-81832af98c30 | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 9 | 006cec2e-6ce2-4549-bffa-eadfcd1e9970 | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 10 | 00704310-78a8-4b38-8475-49f4573b2dbb | 323.000000 | 577.000000 | 160.000000 | 104.000000 | 1 | Lung Opacity |
| 11 | 00704310-78a8-4b38-8475-49f4573b2dbb | 323.000000 | 577.000000 | 160.000000 | 104.000000 | 1 | Lung Opacity |
| 12 | 00704310-78a8-4b38-8475-49f4573b2dbb | 695.000000 | 575.000000 | 162.000000 | 137.000000 | 1 | Lung Opacity |
| 13 | 00704310-78a8-4b38-8475-49f4573b2dbb | 695.000000 | 575.000000 | 162.000000 | 137.000000 | 1 | Lung Opacity |
| 14 | 008c19e8-a820-403a-930a-bc74a4053664 | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 15 | 009482dc-3db5-48d4-8580-5c89c4f01334 | nan | nan | nan | nan | 0 | Normal |
| 16 | 009eb222-eabc-4150-8121-d5a6d06b8ebf | nan | nan | nan | nan | 0 | Normal |
| 17 | 00a85be6-6eb0-421d-8acf-ff2dc0007e8a | nan | nan | nan | nan | 0 | Normal |
| 18 | 00aecb01-a116-45a2-956c-08d2fa55433f | 288.000000 | 322.000000 | 94.000000 | 135.000000 | 1 | Lung Opacity |
| 19 | 00aecb01-a116-45a2-956c-08d2fa55433f | 288.000000 | 322.000000 | 94.000000 | 135.000000 | 1 | Lung Opacity |
* Importing the Submission data
test_submission = pd.read_csv("stage_2_sample_submission.csv")
test_submission.head(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | patientId | PredictionString |
|---|---|---|
| 0 | 0000a175-0e68-4ca4-b1af-167204a7e0bc | 0.5 0 0 100 100 |
| 1 | 0005d3cc-3c3f-40b9-93c3-46231c3eb813 | 0.5 0 0 100 100 |
| 2 | 000686d7-f4fc-448d-97a0-44fa9c5d3aa6 | 0.5 0 0 100 100 |
| 3 | 000e3a7d-c0ca-4349-bb26-5af2d8993c3d | 0.5 0 0 100 100 |
| 4 | 00100a24-854d-423d-a092-edcf6179e061 | 0.5 0 0 100 100 |
| 5 | 0015597f-2d69-4bc7-b642-5b5e01534676 | 0.5 0 0 100 100 |
| 6 | 001b0c51-c7b3-45c1-9c17-fa7594cab96e | 0.5 0 0 100 100 |
| 7 | 0022bb50-bf6c-4185-843e-403a9cc1ea80 | 0.5 0 0 100 100 |
| 8 | 00271e8e-aea8-4f0a-8a34-3025831f1079 | 0.5 0 0 100 100 |
| 9 | 0028450f-5b8e-4695-9416-8340b6f686b0 | 0.5 0 0 100 100 |
| 10 | 002bcde0-d8da-4931-ab04-5d724e30261b | 0.5 0 0 100 100 |
| 11 | 002fcb77-ef76-4626-ab34-5070f15c20db | 0.5 0 0 100 100 |
| 12 | 003206b4-bd4a-4684-8d49-76f4cb713a30 | 0.5 0 0 100 100 |
| 13 | 00330f7f-d114-4eb2-9c6e-558eeb3084a1 | 0.5 0 0 100 100 |
| 14 | 00342ae8-ff81-4229-adf6-6a2ab711707b | 0.5 0 0 100 100 |
| 15 | 003d17f0-bd8a-485c-bc8b-daec33f53efa | 0.5 0 0 100 100 |
| 16 | 003dba79-1b1d-4713-add8-d72c54074f8a | 0.5 0 0 100 100 |
| 17 | 003ec9e3-512e-4f6e-923d-daa9f9f3db9a | 0.5 0 0 100 100 |
| 18 | 003fbda2-ba55-4714-a03a-83f15bec19e4 | 0.5 0 0 100 100 |
| 19 | 0041fc67-793c-4129-a952-ea3fb821b445 | 0.5 0 0 100 100 |
test_submission.duplicated().sum()
0
train_final.shape
(37629, 7)
train_final.describe()
| | x | y | width | height | Target |
|---|---|---|---|---|---|
| count | 16957.000000 | 16957.000000 | 16957.000000 | 16957.000000 | 37629.000000 |
| mean | 398.980008 | 360.443121 | 219.266675 | 337.799552 | 0.450636 |
| std | 204.869392 | 149.202409 | 59.195268 | 158.986899 | 0.497564 |
| min | 2.000000 | 2.000000 | 40.000000 | 45.000000 | 0.000000 |
| 25% | 209.000000 | 243.000000 | 178.000000 | 210.000000 | 0.000000 |
| 50% | 343.000000 | 355.000000 | 218.000000 | 309.000000 | 0.000000 |
| 75% | 596.000000 | 472.000000 | 259.000000 | 452.000000 | 1.000000 |
| max | 835.000000 | 881.000000 | 528.000000 | 942.000000 | 1.000000 |
train_duplicates_count = train_final.duplicated().sum()
train_duplicates_count
7402
train_duplicates = train_final[train_final.duplicated()]
train_duplicates.head(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | patientId | x | y | width | height | Target | class |
|---|---|---|---|---|---|---|---|
| 5 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.000000 | 152.000000 | 213.000000 | 379.000000 | 1 | Lung Opacity |
| 7 | 00436515-870c-4b36-a041-de91049b9ab4 | 562.000000 | 152.000000 | 256.000000 | 453.000000 | 1 | Lung Opacity |
| 11 | 00704310-78a8-4b38-8475-49f4573b2dbb | 323.000000 | 577.000000 | 160.000000 | 104.000000 | 1 | Lung Opacity |
| 13 | 00704310-78a8-4b38-8475-49f4573b2dbb | 695.000000 | 575.000000 | 162.000000 | 137.000000 | 1 | Lung Opacity |
| 19 | 00aecb01-a116-45a2-956c-08d2fa55433f | 288.000000 | 322.000000 | 94.000000 | 135.000000 | 1 | Lung Opacity |
| 21 | 00aecb01-a116-45a2-956c-08d2fa55433f | 547.000000 | 299.000000 | 119.000000 | 165.000000 | 1 | Lung Opacity |
| 23 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | 306.000000 | 544.000000 | 168.000000 | 244.000000 | 1 | Lung Opacity |
| 25 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | 650.000000 | 511.000000 | 206.000000 | 284.000000 | 1 | Lung Opacity |
| 28 | 00f08de1-517e-4652-a04f-d1dc9ee48593 | 181.000000 | 184.000000 | 206.000000 | 506.000000 | 1 | Lung Opacity |
| 30 | 00f08de1-517e-4652-a04f-d1dc9ee48593 | 571.000000 | 275.000000 | 230.000000 | 476.000000 | 1 | Lung Opacity |
| 35 | 010ccb9f-6d46-4380-af11-84f87397a1b8 | 652.000000 | 437.000000 | 161.000000 | 293.000000 | 1 | Lung Opacity |
| 37 | 010ccb9f-6d46-4380-af11-84f87397a1b8 | 301.000000 | 405.000000 | 141.000000 | 279.000000 | 1 | Lung Opacity |
| 40 | 012a5620-d082-4bb8-9b3b-e72d8938000c | 133.000000 | 613.000000 | 275.000000 | 275.000000 | 1 | Lung Opacity |
| 42 | 012a5620-d082-4bb8-9b3b-e72d8938000c | 678.000000 | 427.000000 | 224.000000 | 340.000000 | 1 | Lung Opacity |
| 47 | 0174c4bb-28f5-41e3-a13f-a396badc18bd | 155.000000 | 182.000000 | 273.000000 | 501.000000 | 1 | Lung Opacity |
| 49 | 0174c4bb-28f5-41e3-a13f-a396badc18bd | 599.000000 | 220.000000 | 227.000000 | 508.000000 | 1 | Lung Opacity |
| 53 | 019d950b-dd38-4cf3-a686-527a75728be6 | 229.000000 | 318.000000 | 250.000000 | 301.000000 | 1 | Lung Opacity |
| 55 | 019d950b-dd38-4cf3-a686-527a75728be6 | 604.000000 | 216.000000 | 196.000000 | 328.000000 | 1 | Lung Opacity |
| 60 | 01a6eaa6-222f-4ea8-9874-bbd89dc1a1ce | 141.000000 | 306.000000 | 225.000000 | 327.000000 | 1 | Lung Opacity |
| 62 | 01a6eaa6-222f-4ea8-9874-bbd89dc1a1ce | 609.000000 | 285.000000 | 236.000000 | 355.000000 | 1 | Lung Opacity |
train_final_non_dups = train_final.drop_duplicates(keep='first')
train_final_non_dups = train_final_non_dups.reset_index(drop=True)
train_final.shape
(37629, 7)
train_final_non_dups.shape
(30227, 7)
37629-30227
7402
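The 7,402 duplicates come from the many-to-many merge on `patientId`: when a patient has two bounding-box rows in the label file and two matching rows in the class file, the merge emits the full cross product. A small pandas sketch with toy data:

```python
import pandas as pd

# Toy data: one patient with two bounding boxes and two (identical) class rows
labels = pd.DataFrame({"patientId": ["p1", "p1"], "x": [264, 562]})
classes = pd.DataFrame({"patientId": ["p1", "p1"],
                        "class": ["Lung Opacity", "Lung Opacity"]})

# Many-to-many merge: 2 label rows x 2 class rows = 4 rows
merged = pd.merge(labels, classes, on="patientId")

# Each (box, class) pair appears twice, so dropping duplicates halves the frame
deduped = merged.drop_duplicates(keep="first").reset_index(drop=True)
```

This is exactly the pattern visible above: each `Lung Opacity` box row appears twice after the merge, and `drop_duplicates` restores one row per box.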
train_meta_df = train_final_non_dups.copy()
train_meta_df.head(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | patientId | x | y | width | height | Target | class |
|---|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | nan | nan | nan | nan | 0 | Normal |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.000000 | 152.000000 | 213.000000 | 379.000000 | 1 | Lung Opacity |
| 5 | 00436515-870c-4b36-a041-de91049b9ab4 | 562.000000 | 152.000000 | 256.000000 | 453.000000 | 1 | Lung Opacity |
| 6 | 00569f44-917d-4c86-a842-81832af98c30 | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 7 | 006cec2e-6ce2-4549-bffa-eadfcd1e9970 | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 8 | 00704310-78a8-4b38-8475-49f4573b2dbb | 323.000000 | 577.000000 | 160.000000 | 104.000000 | 1 | Lung Opacity |
| 9 | 00704310-78a8-4b38-8475-49f4573b2dbb | 695.000000 | 575.000000 | 162.000000 | 137.000000 | 1 | Lung Opacity |
| 10 | 008c19e8-a820-403a-930a-bc74a4053664 | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 11 | 009482dc-3db5-48d4-8580-5c89c4f01334 | nan | nan | nan | nan | 0 | Normal |
| 12 | 009eb222-eabc-4150-8121-d5a6d06b8ebf | nan | nan | nan | nan | 0 | Normal |
| 13 | 00a85be6-6eb0-421d-8acf-ff2dc0007e8a | nan | nan | nan | nan | 0 | Normal |
| 14 | 00aecb01-a116-45a2-956c-08d2fa55433f | 288.000000 | 322.000000 | 94.000000 | 135.000000 | 1 | Lung Opacity |
| 15 | 00aecb01-a116-45a2-956c-08d2fa55433f | 547.000000 | 299.000000 | 119.000000 | 165.000000 | 1 | Lung Opacity |
| 16 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | 306.000000 | 544.000000 | 168.000000 | 244.000000 | 1 | Lung Opacity |
| 17 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | 650.000000 | 511.000000 | 206.000000 | 284.000000 | 1 | Lung Opacity |
| 18 | 00d7c36e-3cdf-4df6-ac03-6c30cdc8e85b | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 19 | 00f08de1-517e-4652-a04f-d1dc9ee48593 | 181.000000 | 184.000000 | 206.000000 | 506.000000 | 1 | Lung Opacity |
train_meta_df.dtypes
patientId     object
x            float64
y            float64
width        float64
height       float64
Target         int64
class         object
dtype: object
train_Lung_Opacity = train_meta_df[train_meta_df["class"] == "Lung Opacity"]
train_Normal = train_meta_df[train_meta_df["class"] == "Normal"]
train_Not_Normal = train_meta_df[train_meta_df["class"] == "No Lung Opacity / Not Normal"]
print("len of Lung opacity",len(train_Lung_Opacity))
print("len of Normal",len(train_Normal))
print("len of Not Normal",len(train_Not_Normal))
print("")
len of Lung opacity 9555
len of Normal 8851
len of Not Normal 11821
Lung_Opacity_Unique = train_Lung_Opacity["patientId"].unique()
Normal_Unique = train_Normal["patientId"].unique()
Not_Normal_Unique = train_Not_Normal["patientId"].unique()
print("No. of Unique Pneumonia PatientIds",len(Lung_Opacity_Unique))
print("No. of Unique Normal PatientIds",len(Normal_Unique))
print("No. of Unique Not Normal",len(Not_Normal_Unique))
No. of Unique Pneumonia PatientIds 6012
No. of Unique Normal PatientIds 8851
No. of Unique Not Normal 11821
print(6012+8851+11821)
26684
This gives 26,684 unique patientIds, which exactly matches the number of training images provided to us
print(Lung_Opacity_Unique[:5])
print("")
print(Normal_Unique[:5])
print("")
print(Not_Normal_Unique[:5])
print("")
['00436515-870c-4b36-a041-de91049b9ab4' '00704310-78a8-4b38-8475-49f4573b2dbb'
 '00aecb01-a116-45a2-956c-08d2fa55433f' '00c0b293-48e7-4e16-ac76-9269ba535a62'
 '00f08de1-517e-4652-a04f-d1dc9ee48593']

['003d8fa0-6bf1-40ed-b54c-ac657f8495c5' '009482dc-3db5-48d4-8580-5c89c4f01334'
 '009eb222-eabc-4150-8121-d5a6d06b8ebf' '00a85be6-6eb0-421d-8acf-ff2dc0007e8a'
 '00f87de5-5fe0-4921-93ea-914d7e683266']

['0004cfab-14fd-4e49-80ba-63a80b6bddd6' '00313ee0-9eaa-42f4-b0ab-c148ed3241cd'
 '00322d4d-1c29-4943-afc9-b6754be640eb' '00569f44-917d-4c86-a842-81832af98c30'
 '006cec2e-6ce2-4549-bffa-eadfcd1e9970']
Unzipping the Training and Testing Images
with ZipFile('stage_2_train_images.zip', 'r') as z:
z.extractall()
with ZipFile('stage_2_test_images.zip', 'r') as z:
z.extractall()
Segregating the training images into three folders under "Training":
- Lung Opacity
- Normal
- Not Normal
# Create the "Training" folder with one subfolder per class
dst1 = "Training"
dst2 = "Lung Opacity"
dst3 = "Normal"
dst4 = "Not Normal"
os.makedirs(dst1, exist_ok=True)
path1 = os.path.join(dst1, dst2)
path2 = os.path.join(dst1, dst3)
path3 = os.path.join(dst1, dst4)
for path in (path1, path2, path3):
    os.makedirs(path, exist_ok=True)
Copying Pneumonia images to "Training/Lung Opacity" folder
# Source path
source = "stage_2_train_images"
# Destination path
destination = "Training/Lung Opacity"
for item in Lung_Opacity_Unique:
    file = item + ".dcm"
    src_file = os.path.join(source, file)
    dst_file = os.path.join(destination, file)
    shutil.copyfile(src_file, dst_file)
Copying Normal Images to "Training/Normal" folder
# Source path
source = "stage_2_train_images"
# Destination path
destination = "Training/Normal"
for item in Normal_Unique:
    file = item + ".dcm"
    src_file = os.path.join(source, file)
    dst_file = os.path.join(destination, file)
    shutil.copyfile(src_file, dst_file)
Copying Not Normal images to "Training/Not Normal" folder
# Source path
source = "stage_2_train_images"
# Destination path
destination = "Training/Not Normal"
for item in Not_Normal_Unique:
    file = item + ".dcm"
    src_file = os.path.join(source, file)
    dst_file = os.path.join(destination, file)
    shutil.copyfile(src_file, dst_file)
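The three copy loops above differ only in their ID list and destination folder, so they can be collapsed into one helper (the name `copy_class_images` is my own, not from the original notebook). A self-contained sketch with a tiny temporary-directory demo:

```python
import os
import shutil
import tempfile

def copy_class_images(patient_ids, source, destination, ext=".dcm"):
    """Copy <patientId><ext> files from `source` into a per-class folder."""
    os.makedirs(destination, exist_ok=True)
    for pid in patient_ids:
        fname = pid + ext
        shutil.copyfile(os.path.join(source, fname),
                        os.path.join(destination, fname))
    return len(patient_ids)

# Demo with throwaway dirs and empty placeholder files
with tempfile.TemporaryDirectory() as tmp:
    src = os.path.join(tmp, "src")
    dst = os.path.join(tmp, "Training", "Lung Opacity")
    os.makedirs(src)
    for pid in ("a", "b"):
        open(os.path.join(src, pid + ".dcm"), "wb").close()
    n = copy_class_images(["a", "b"], src, dst)
    copied = sorted(os.listdir(dst))
```

In the notebook this would be called once per class, e.g. `copy_class_images(Lung_Opacity_Unique, "stage_2_train_images", "Training/Lung Opacity")`.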
Cross-checking the "Training/Lung Opacity", "Training/Normal", and "Training/Not Normal" folders to see how many images are available
Lung_opa_list = os.listdir("Training/Lung Opacity")
Lung_opa_number_files = len(Lung_opa_list)
print('\nNumber of images in Lung Opacity class after segregation:', Lung_opa_number_files)
Normal_list = os.listdir("Training/Normal")
Normal_number_files = len(Normal_list)
print('\nNumber of images in Normal class after segregation:', Normal_number_files)
Not_Normal_list = os.listdir("Training/Not Normal")
Not_Normal_number_files = len(Not_Normal_list)
print('\nNumber of images in Not Normal class after segregation:', Not_Normal_number_files)
Number of images in Lung Opacity class after segregation: 6012

Number of images in Normal class after segregation: 8851

Number of images in Not Normal class after segregation: 11821
print("\nNo. of Unique Pneumonia PatientIds :",len(Lung_Opacity_Unique))
print("\nNo. of Unique Normal PatientIds :",len(Normal_Unique))
print("\nNo. of Unique Not Normal :",len(Not_Normal_Unique))
No. of Unique Pneumonia PatientIds : 6012

No. of Unique Normal PatientIds : 8851

No. of Unique Not Normal : 11821
All the unique Pneumonia, Normal, and Not Normal data rows match the corresponding images in the folders.
Mapping training images to their folders and corresponding class.
image_dir_path = "Training"
paths = [path.parts[-3:] for path in
Path(image_dir_path).rglob('*.dcm')]
Images_df = pd.DataFrame(data=paths, columns=['Folder', 'Image_Class', 'Image_file'])
Images_df.head(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | Folder | Image_Class | Image_file |
|---|---|---|---|
| 0 | Training | Lung Opacity | 000db696-cf54-4385-b10b-6b16fbb3f985.dcm |
| 1 | Training | Lung Opacity | 000fe35a-2649-43d4-b027-e67796d412e0.dcm |
| 2 | Training | Lung Opacity | 001031d9-f904-4a23-b3e5-2c088acd19c6.dcm |
| 3 | Training | Lung Opacity | 001916b8-3d30-4935-a5d1-8eaddb1646cd.dcm |
| 4 | Training | Lung Opacity | 0022073f-cec8-42ec-ab5f-bc2314649235.dcm |
| 5 | Training | Lung Opacity | 002cb550-2e31-42f1-a29d-fbc279977e71.dcm |
| 6 | Training | Lung Opacity | 00436515-870c-4b36-a041-de91049b9ab4.dcm |
| 7 | Training | Lung Opacity | 00704310-78a8-4b38-8475-49f4573b2dbb.dcm |
| 8 | Training | Lung Opacity | 0087bd3a-55a7-4045-b111-b018fa52d361.dcm |
| 9 | Training | Lung Opacity | 00a05408-8291-4231-886e-13763e103161.dcm |
| 10 | Training | Lung Opacity | 00aecb01-a116-45a2-956c-08d2fa55433f.dcm |
| 11 | Training | Lung Opacity | 00c0b293-48e7-4e16-ac76-9269ba535a62.dcm |
| 12 | Training | Lung Opacity | 00eeb3c9-a892-4fac-a67a-aaa6cc7ffd5c.dcm |
| 13 | Training | Lung Opacity | 00f08de1-517e-4652-a04f-d1dc9ee48593.dcm |
| 14 | Training | Lung Opacity | 0100515c-5204-4f31-98e0-f35e4b00004a.dcm |
| 15 | Training | Lung Opacity | 0101174b-6643-4d4e-b4ba-b6d41d0ce46a.dcm |
| 16 | Training | Lung Opacity | 010ccb9f-6d46-4380-af11-84f87397a1b8.dcm |
| 17 | Training | Lung Opacity | 012a5620-d082-4bb8-9b3b-e72d8938000c.dcm |
| 18 | Training | Lung Opacity | 013c7df0-d66d-4cb1-b3bc-a70085160311.dcm |
| 19 | Training | Lung Opacity | 0174c4bb-28f5-41e3-a13f-a396badc18bd.dcm |
Images_df.tail(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | Folder | Image_Class | Image_file |
|---|---|---|---|
| 26664 | Training | Not Normal | ffb8998b-f262-4330-820d-bf48a4d8fec1.dcm |
| 26665 | Training | Not Normal | ffc01e64-ba14-4620-8016-235fc1609767.dcm |
| 26666 | Training | Not Normal | ffcfe8c1-5641-4dc3-910c-573e3227f536.dcm |
| 26667 | Training | Not Normal | ffd0c212-bfb6-41e3-b17b-b927b99d1730.dcm |
| 26668 | Training | Not Normal | ffd2bc74-f9d6-49fb-84d7-de060ee22583.dcm |
| 26669 | Training | Not Normal | ffd670a5-b6dc-4f54-928a-69b7a04662eb.dcm |
| 26670 | Training | Not Normal | ffdc771e-1f1a-47f7-b732-3e06e48c24e8.dcm |
| 26671 | Training | Not Normal | ffdc957e-6239-427d-8a54-fdf8ced3a356.dcm |
| 26672 | Training | Not Normal | ffde3879-241c-4b03-9d91-625cae6b49e8.dcm |
| 26673 | Training | Not Normal | ffde3e7e-849c-4077-bfd0-4e4498ee8817.dcm |
| 26674 | Training | Not Normal | ffe4707d-517c-4ed0-8a9f-3ad149748991.dcm |
| 26675 | Training | Not Normal | ffe9a1c4-634d-408f-9b0f-8098ffb78a4e.dcm |
| 26676 | Training | Not Normal | ffe9ab5c-9d39-4235-9ade-d725dcad6b76.dcm |
| 26677 | Training | Not Normal | ffeab8ae-f339-40d6-96f1-f1f8c97d2cb1.dcm |
| 26678 | Training | Not Normal | ffee0360-20cf-4d4a-9af5-e5c9b493c73b.dcm |
| 26679 | Training | Not Normal | fff1cc9c-3895-43be-84e1-a7aaef21002b.dcm |
| 26680 | Training | Not Normal | fff7447f-99ce-4102-87f3-9788b2459eb4.dcm |
| 26681 | Training | Not Normal | fffc95b5-605b-4226-80ab-62caec682b22.dcm |
| 26682 | Training | Not Normal | fffcff11-d018-4414-971a-a7cefa327795.dcm |
| 26683 | Training | Not Normal | fffec09e-8a4a-48b1-b33e-ab4890ccd136.dcm |
Images_df["Full_filename"] = Images_df["Folder"]+"/"+Images_df["Image_Class"]+"/"+Images_df["Image_file"]
Images_df.head(25).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | Folder | Image_Class | Image_file | Full_filename |
|---|---|---|---|---|
| 0 | Training | Lung Opacity | 000db696-cf54-4385-b10b-6b16fbb3f985.dcm | Training/Lung Opacity/000db696-cf54-4385-b10b-6b16fbb3f985.dcm |
| 1 | Training | Lung Opacity | 000fe35a-2649-43d4-b027-e67796d412e0.dcm | Training/Lung Opacity/000fe35a-2649-43d4-b027-e67796d412e0.dcm |
| 2 | Training | Lung Opacity | 001031d9-f904-4a23-b3e5-2c088acd19c6.dcm | Training/Lung Opacity/001031d9-f904-4a23-b3e5-2c088acd19c6.dcm |
| 3 | Training | Lung Opacity | 001916b8-3d30-4935-a5d1-8eaddb1646cd.dcm | Training/Lung Opacity/001916b8-3d30-4935-a5d1-8eaddb1646cd.dcm |
| 4 | Training | Lung Opacity | 0022073f-cec8-42ec-ab5f-bc2314649235.dcm | Training/Lung Opacity/0022073f-cec8-42ec-ab5f-bc2314649235.dcm |
| 5 | Training | Lung Opacity | 002cb550-2e31-42f1-a29d-fbc279977e71.dcm | Training/Lung Opacity/002cb550-2e31-42f1-a29d-fbc279977e71.dcm |
| 6 | Training | Lung Opacity | 00436515-870c-4b36-a041-de91049b9ab4.dcm | Training/Lung Opacity/00436515-870c-4b36-a041-de91049b9ab4.dcm |
| 7 | Training | Lung Opacity | 00704310-78a8-4b38-8475-49f4573b2dbb.dcm | Training/Lung Opacity/00704310-78a8-4b38-8475-49f4573b2dbb.dcm |
| 8 | Training | Lung Opacity | 0087bd3a-55a7-4045-b111-b018fa52d361.dcm | Training/Lung Opacity/0087bd3a-55a7-4045-b111-b018fa52d361.dcm |
| 9 | Training | Lung Opacity | 00a05408-8291-4231-886e-13763e103161.dcm | Training/Lung Opacity/00a05408-8291-4231-886e-13763e103161.dcm |
| 10 | Training | Lung Opacity | 00aecb01-a116-45a2-956c-08d2fa55433f.dcm | Training/Lung Opacity/00aecb01-a116-45a2-956c-08d2fa55433f.dcm |
| 11 | Training | Lung Opacity | 00c0b293-48e7-4e16-ac76-9269ba535a62.dcm | Training/Lung Opacity/00c0b293-48e7-4e16-ac76-9269ba535a62.dcm |
| 12 | Training | Lung Opacity | 00eeb3c9-a892-4fac-a67a-aaa6cc7ffd5c.dcm | Training/Lung Opacity/00eeb3c9-a892-4fac-a67a-aaa6cc7ffd5c.dcm |
| 13 | Training | Lung Opacity | 00f08de1-517e-4652-a04f-d1dc9ee48593.dcm | Training/Lung Opacity/00f08de1-517e-4652-a04f-d1dc9ee48593.dcm |
| 14 | Training | Lung Opacity | 0100515c-5204-4f31-98e0-f35e4b00004a.dcm | Training/Lung Opacity/0100515c-5204-4f31-98e0-f35e4b00004a.dcm |
| 15 | Training | Lung Opacity | 0101174b-6643-4d4e-b4ba-b6d41d0ce46a.dcm | Training/Lung Opacity/0101174b-6643-4d4e-b4ba-b6d41d0ce46a.dcm |
| 16 | Training | Lung Opacity | 010ccb9f-6d46-4380-af11-84f87397a1b8.dcm | Training/Lung Opacity/010ccb9f-6d46-4380-af11-84f87397a1b8.dcm |
| 17 | Training | Lung Opacity | 012a5620-d082-4bb8-9b3b-e72d8938000c.dcm | Training/Lung Opacity/012a5620-d082-4bb8-9b3b-e72d8938000c.dcm |
| 18 | Training | Lung Opacity | 013c7df0-d66d-4cb1-b3bc-a70085160311.dcm | Training/Lung Opacity/013c7df0-d66d-4cb1-b3bc-a70085160311.dcm |
| 19 | Training | Lung Opacity | 0174c4bb-28f5-41e3-a13f-a396badc18bd.dcm | Training/Lung Opacity/0174c4bb-28f5-41e3-a13f-a396badc18bd.dcm |
| 20 | Training | Lung Opacity | 018951e6-9fb3-4e92-8ce7-e3a018daf93f.dcm | Training/Lung Opacity/018951e6-9fb3-4e92-8ce7-e3a018daf93f.dcm |
| 21 | Training | Lung Opacity | 019d950b-dd38-4cf3-a686-527a75728be6.dcm | Training/Lung Opacity/019d950b-dd38-4cf3-a686-527a75728be6.dcm |
| 22 | Training | Lung Opacity | 01a6eaa6-222f-4ea8-9874-bbd89dc1a1ce.dcm | Training/Lung Opacity/01a6eaa6-222f-4ea8-9874-bbd89dc1a1ce.dcm |
| 23 | Training | Lung Opacity | 01a7353d-25bb-4ff8-916b-f50dd541dccf.dcm | Training/Lung Opacity/01a7353d-25bb-4ff8-916b-f50dd541dccf.dcm |
| 24 | Training | Lung Opacity | 01adfd2f-7bc7-4cef-ab68-a0992752b620.dcm | Training/Lung Opacity/01adfd2f-7bc7-4cef-ab68-a0992752b620.dcm |
Images_df.describe()
| | Folder | Image_Class | Image_file | Full_filename |
|---|---|---|---|---|
| count | 26684 | 26684 | 26684 | 26684 |
| unique | 1 | 3 | 26684 | 26684 |
| top | Training | Not Normal | 000db696-cf54-4385-b10b-6b16fbb3f985.dcm | Training/Lung Opacity/000db696-cf54-4385-b10b-6b16fbb3f985.dcm |
| freq | 26684 | 11821 | 1 | 1 |
# creating instance of labelencoder
labelencoder = LabelEncoder()
# Assigning numerical values and storing in another column
Images_df['Image_Class_Category'] = labelencoder.fit_transform(Images_df['Image_Class'])
Images_df['Image_Target'] = Images_df['Image_Class'].apply(lambda x: 1 if x == 'Lung Opacity' else 0)
Images_df.dtypes
Folder                  object
Image_Class             object
Image_file              object
Full_filename           object
Image_Class_Category     int32
Image_Target             int64
dtype: object
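LabelEncoder sorts the class labels alphabetically before assigning integer codes, which is why 'Lung Opacity' becomes category 0, 'Normal' becomes 1, and 'Not Normal' becomes 2. A minimal sketch on a hypothetical list of labels:

```python
from sklearn.preprocessing import LabelEncoder

le = LabelEncoder()
# Hypothetical sample of class labels, in arbitrary order
codes = le.fit_transform(['Not Normal', 'Lung Opacity', 'Normal', 'Lung Opacity'])

# Classes are sorted alphabetically before coding
print(list(le.classes_))  # ['Lung Opacity', 'Normal', 'Not Normal']
print(list(codes))        # [2, 0, 1, 0]
```

This alphabetical ordering explains why the separate binary `Image_Target` column is still needed: the encoder's code 0 happens to land on 'Lung Opacity' here, but that is a coincidence of sorting, not a deliberate positive-class choice.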
Pickling the Images_df dataframe for future reference
Images_df.to_pickle("Images_df.pkl")
Unpickling the Images_df dataframe
unpickled_df = pd.read_pickle("Images_df.pkl")
Images_df = unpickled_df.copy()
Images_df.dtypes
Folder                  object
Image_Class             object
Image_file              object
Full_filename           object
Image_Class_Category     int32
Image_Target             int64
dtype: object
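A quick way to confirm the pickle round trip preserved the dataframe exactly (values and dtypes) is pandas' own testing utility. A small sketch with a hypothetical stand-in frame and filename:

```python
import pandas as pd

# Small hypothetical frame standing in for Images_df
df = pd.DataFrame({'Image_Class': ['Lung Opacity', 'Normal'],
                   'Image_Target': [1, 0]})

df.to_pickle("images_demo.pkl")            # hypothetical filename
restored = pd.read_pickle("images_demo.pkl")

# Raises if any value, dtype, or index differs
pd.testing.assert_frame_equal(df, restored)
```

Unlike CSV round trips, pickling keeps the `int32`/`int64` distinction seen in the dtypes above, so no re-casting is needed after reload.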
Images_df.head(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | Folder | Image_Class | Image_file | Full_filename | Image_Class_Category | Image_Target |
|---|---|---|---|---|---|---|
| 0 | Training | Lung Opacity | 000db696-cf54-4385-b10b-6b16fbb3f985.dcm | Training/Lung Opacity/000db696-cf54-4385-b10b-6b16fbb3f985.dcm | 0 | 1 |
| 1 | Training | Lung Opacity | 000fe35a-2649-43d4-b027-e67796d412e0.dcm | Training/Lung Opacity/000fe35a-2649-43d4-b027-e67796d412e0.dcm | 0 | 1 |
| 2 | Training | Lung Opacity | 001031d9-f904-4a23-b3e5-2c088acd19c6.dcm | Training/Lung Opacity/001031d9-f904-4a23-b3e5-2c088acd19c6.dcm | 0 | 1 |
| 3 | Training | Lung Opacity | 001916b8-3d30-4935-a5d1-8eaddb1646cd.dcm | Training/Lung Opacity/001916b8-3d30-4935-a5d1-8eaddb1646cd.dcm | 0 | 1 |
| 4 | Training | Lung Opacity | 0022073f-cec8-42ec-ab5f-bc2314649235.dcm | Training/Lung Opacity/0022073f-cec8-42ec-ab5f-bc2314649235.dcm | 0 | 1 |
| 5 | Training | Lung Opacity | 002cb550-2e31-42f1-a29d-fbc279977e71.dcm | Training/Lung Opacity/002cb550-2e31-42f1-a29d-fbc279977e71.dcm | 0 | 1 |
| 6 | Training | Lung Opacity | 00436515-870c-4b36-a041-de91049b9ab4.dcm | Training/Lung Opacity/00436515-870c-4b36-a041-de91049b9ab4.dcm | 0 | 1 |
| 7 | Training | Lung Opacity | 00704310-78a8-4b38-8475-49f4573b2dbb.dcm | Training/Lung Opacity/00704310-78a8-4b38-8475-49f4573b2dbb.dcm | 0 | 1 |
| 8 | Training | Lung Opacity | 0087bd3a-55a7-4045-b111-b018fa52d361.dcm | Training/Lung Opacity/0087bd3a-55a7-4045-b111-b018fa52d361.dcm | 0 | 1 |
| 9 | Training | Lung Opacity | 00a05408-8291-4231-886e-13763e103161.dcm | Training/Lung Opacity/00a05408-8291-4231-886e-13763e103161.dcm | 0 | 1 |
| 10 | Training | Lung Opacity | 00aecb01-a116-45a2-956c-08d2fa55433f.dcm | Training/Lung Opacity/00aecb01-a116-45a2-956c-08d2fa55433f.dcm | 0 | 1 |
| 11 | Training | Lung Opacity | 00c0b293-48e7-4e16-ac76-9269ba535a62.dcm | Training/Lung Opacity/00c0b293-48e7-4e16-ac76-9269ba535a62.dcm | 0 | 1 |
| 12 | Training | Lung Opacity | 00eeb3c9-a892-4fac-a67a-aaa6cc7ffd5c.dcm | Training/Lung Opacity/00eeb3c9-a892-4fac-a67a-aaa6cc7ffd5c.dcm | 0 | 1 |
| 13 | Training | Lung Opacity | 00f08de1-517e-4652-a04f-d1dc9ee48593.dcm | Training/Lung Opacity/00f08de1-517e-4652-a04f-d1dc9ee48593.dcm | 0 | 1 |
| 14 | Training | Lung Opacity | 0100515c-5204-4f31-98e0-f35e4b00004a.dcm | Training/Lung Opacity/0100515c-5204-4f31-98e0-f35e4b00004a.dcm | 0 | 1 |
| 15 | Training | Lung Opacity | 0101174b-6643-4d4e-b4ba-b6d41d0ce46a.dcm | Training/Lung Opacity/0101174b-6643-4d4e-b4ba-b6d41d0ce46a.dcm | 0 | 1 |
| 16 | Training | Lung Opacity | 010ccb9f-6d46-4380-af11-84f87397a1b8.dcm | Training/Lung Opacity/010ccb9f-6d46-4380-af11-84f87397a1b8.dcm | 0 | 1 |
| 17 | Training | Lung Opacity | 012a5620-d082-4bb8-9b3b-e72d8938000c.dcm | Training/Lung Opacity/012a5620-d082-4bb8-9b3b-e72d8938000c.dcm | 0 | 1 |
| 18 | Training | Lung Opacity | 013c7df0-d66d-4cb1-b3bc-a70085160311.dcm | Training/Lung Opacity/013c7df0-d66d-4cb1-b3bc-a70085160311.dcm | 0 | 1 |
| 19 | Training | Lung Opacity | 0174c4bb-28f5-41e3-a13f-a396badc18bd.dcm | Training/Lung Opacity/0174c4bb-28f5-41e3-a13f-a396badc18bd.dcm | 0 | 1 |
Function to display randomly selected images and their labels
import math

import cv2
import matplotlib.pyplot as plt
import pydicom as dicom

def show_images(passed_value):
    # Randomly sample the requested number of images
    show_df = Images_df.sample(n=passed_value)
    # One grid row of 5 plots per 5 images, rounded up
    z = math.ceil(passed_value / 5)
    # squeeze=False keeps ax two-dimensional even when z == 1
    fig, ax = plt.subplots(z, 5, constrained_layout=True, squeeze=False)
    fig.set_figheight(20)
    fig.set_figwidth(20)
    i = 0
    j = 0
    for index, row in show_df.iterrows():
        image_path = row['Full_filename']
        ds = dicom.dcmread(image_path)
        img = ds.pixel_array
        # Resize to 224x224 for a uniform display grid
        height_1 = 224
        width_1 = 224
        dim = (width_1, height_1)
        img_resize = cv2.resize(img, dim, interpolation=cv2.INTER_LINEAR)
        ax[i][j].imshow(img_resize, cmap=plt.cm.bone)
        ax[i][j].set_title(row['Image_Class'])
        j = j + 1
        if j == 5:
            i = i + 1
            j = 0
    plt.show()
    return show_df
show_images(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | Folder | Image_Class | Image_file | Full_filename | Image_Class_Category | Image_Target |
|---|---|---|---|---|---|---|
| 15613 | Training | Not Normal | 1f929e8d-0289-4712-806b-8894c2ba2f93.dcm | Training/Not Normal/1f929e8d-0289-4712-806b-8894c2ba2f93.dcm | 2 | 0 |
| 21995 | Training | Not Normal | a616c756-c1f4-4c3b-af81-cc51695d4fe4.dcm | Training/Not Normal/a616c756-c1f4-4c3b-af81-cc51695d4fe4.dcm | 2 | 0 |
| 21703 | Training | Not Normal | a03bf35b-b57b-4cd0-bac4-c5d7c13a168a.dcm | Training/Not Normal/a03bf35b-b57b-4cd0-bac4-c5d7c13a168a.dcm | 2 | 0 |
| 6639 | Training | Normal | 197146c0-dfe6-4938-ab89-5fbef34150fc.dcm | Training/Normal/197146c0-dfe6-4938-ab89-5fbef34150fc.dcm | 1 | 0 |
| 2660 | Training | Lung Opacity | 7860706e-b07d-4c69-b17c-03ca4f8e8a95.dcm | Training/Lung Opacity/7860706e-b07d-4c69-b17c-03ca4f8e8a95.dcm | 0 | 1 |
| 18912 | Training | Not Normal | 6a17292a-fcef-423b-a7a2-046b6923d559.dcm | Training/Not Normal/6a17292a-fcef-423b-a7a2-046b6923d559.dcm | 2 | 0 |
| 5445 | Training | Lung Opacity | de304f9b-f407-43d5-89b9-bdb07f8308b1.dcm | Training/Lung Opacity/de304f9b-f407-43d5-89b9-bdb07f8308b1.dcm | 0 | 1 |
| 23503 | Training | Not Normal | cca50e52-2b44-40ba-84e7-430aa3ff3f98.dcm | Training/Not Normal/cca50e52-2b44-40ba-84e7-430aa3ff3f98.dcm | 2 | 0 |
| 627 | Training | Lung Opacity | 1a2cef8e-99cb-43cc-b9b5-65fe5104037f.dcm | Training/Lung Opacity/1a2cef8e-99cb-43cc-b9b5-65fe5104037f.dcm | 0 | 1 |
| 20246 | Training | Not Normal | 818a1ce8-68ef-4cdd-b92d-f09ba72c692b.dcm | Training/Not Normal/818a1ce8-68ef-4cdd-b92d-f09ba72c692b.dcm | 2 | 0 |
| 2295 | Training | Lung Opacity | 67949b55-7b2b-41be-becc-70f86797db4e.dcm | Training/Lung Opacity/67949b55-7b2b-41be-becc-70f86797db4e.dcm | 0 | 1 |
| 13155 | Training | Normal | d341810f-864a-4f08-8da2-c8b0896d6db7.dcm | Training/Normal/d341810f-864a-4f08-8da2-c8b0896d6db7.dcm | 1 | 0 |
| 10606 | Training | Normal | 8fae133a-b424-48b1-96a8-a2add7b8b8a2.dcm | Training/Normal/8fae133a-b424-48b1-96a8-a2add7b8b8a2.dcm | 1 | 0 |
| 9259 | Training | Normal | 6dce4fb8-2166-4f50-be8a-8610eab74026.dcm | Training/Normal/6dce4fb8-2166-4f50-be8a-8610eab74026.dcm | 1 | 0 |
| 16778 | Training | Not Normal | 467ba515-37f4-4014-9fb1-a981daa58286.dcm | Training/Not Normal/467ba515-37f4-4014-9fb1-a981daa58286.dcm | 2 | 0 |
| 12063 | Training | Normal | b3c3a0b7-e29b-4e0c-ba93-8c00c26ba5ad.dcm | Training/Normal/b3c3a0b7-e29b-4e0c-ba93-8c00c26ba5ad.dcm | 1 | 0 |
| 16611 | Training | Not Normal | 440e0b82-41f4-4b08-a0e3-405a2831a7e0.dcm | Training/Not Normal/440e0b82-41f4-4b08-a0e3-405a2831a7e0.dcm | 2 | 0 |
| 6237 | Training | Normal | 09c0e0e5-e269-4d0c-89ea-59d1a77e2097.dcm | Training/Normal/09c0e0e5-e269-4d0c-89ea-59d1a77e2097.dcm | 1 | 0 |
| 22687 | Training | Not Normal | bb6291b5-9f76-4f55-80d2-45edf39b85be.dcm | Training/Not Normal/bb6291b5-9f76-4f55-80d2-45edf39b85be.dcm | 2 | 0 |
| 13765 | Training | Normal | e20c5b46-c709-49e9-972a-86d845c1defc.dcm | Training/Normal/e20c5b46-c709-49e9-972a-86d845c1defc.dcm | 1 | 0 |
Using fastai to read DICOM images and extract metadata
img = 'stage_2_train_images/0004cfab-14fd-4e49-80ba-63a80b6bddd6.dcm'
sample_md = dicom.filereader.dcmread(img)
sample_md
Dataset.file_meta -------------------------------
(0002, 0000) File Meta Information Group Length  UL: 202
(0002, 0001) File Meta Information Version       OB: b'\x00\x01'
(0002, 0002) Media Storage SOP Class UID         UI: Secondary Capture Image Storage
(0002, 0003) Media Storage SOP Instance UID      UI: 1.2.276.0.7230010.3.1.4.8323329.28530.1517874485.775526
(0002, 0010) Transfer Syntax UID                 UI: JPEG Baseline (Process 1)
(0002, 0012) Implementation Class UID            UI: 1.2.276.0.7230010.3.0.3.6.0
(0002, 0013) Implementation Version Name         SH: 'OFFIS_DCMTK_360'
-------------------------------------------------
(0008, 0005) Specific Character Set              CS: 'ISO_IR 100'
(0008, 0016) SOP Class UID                       UI: Secondary Capture Image Storage
(0008, 0018) SOP Instance UID                    UI: 1.2.276.0.7230010.3.1.4.8323329.28530.1517874485.775526
(0008, 0020) Study Date                          DA: '19010101'
(0008, 0030) Study Time                          TM: '000000.00'
(0008, 0050) Accession Number                    SH: ''
(0008, 0060) Modality                            CS: 'CR'
(0008, 0064) Conversion Type                     CS: 'WSD'
(0008, 0090) Referring Physician's Name          PN: ''
(0008, 103e) Series Description                  LO: 'view: PA'
(0010, 0010) Patient's Name                      PN: '0004cfab-14fd-4e49-80ba-63a80b6bddd6'
(0010, 0020) Patient ID                          LO: '0004cfab-14fd-4e49-80ba-63a80b6bddd6'
(0010, 0030) Patient's Birth Date                DA: ''
(0010, 0040) Patient's Sex                       CS: 'F'
(0010, 1010) Patient's Age                       AS: '51'
(0018, 0015) Body Part Examined                  CS: 'CHEST'
(0018, 5101) View Position                       CS: 'PA'
(0020, 000d) Study Instance UID                  UI: 1.2.276.0.7230010.3.1.2.8323329.28530.1517874485.775525
(0020, 000e) Series Instance UID                 UI: 1.2.276.0.7230010.3.1.3.8323329.28530.1517874485.775524
(0020, 0010) Study ID                            SH: ''
(0020, 0011) Series Number                       IS: '1'
(0020, 0013) Instance Number                     IS: '1'
(0020, 0020) Patient Orientation                 CS: ''
(0028, 0002) Samples per Pixel                   US: 1
(0028, 0004) Photometric Interpretation          CS: 'MONOCHROME2'
(0028, 0010) Rows                                US: 1024
(0028, 0011) Columns                             US: 1024
(0028, 0030) Pixel Spacing                       DS: [0.14300000000000002, 0.14300000000000002]
(0028, 0100) Bits Allocated                      US: 8
(0028, 0101) Bits Stored                         US: 8
(0028, 0102) High Bit                            US: 7
(0028, 0103) Pixel Representation                US: 0
(0028, 2110) Lossy Image Compression             CS: '01'
(0028, 2114) Lossy Image Compression Method      CS: 'ISO_10918_1'
(7fe0, 0010) Pixel Data                          OB: Array of 142006 elements
from fastai.medical.imaging import get_dicom_files

items_train = get_dicom_files('stage_2_train_images/')
# `from_dicoms` is a pd.DataFrame extension added by fastai.medical.imaging
dicom_train = pd.DataFrame.from_dicoms(items_train).reset_index(drop = True)
dicom_train.to_csv("dicom_train.csv", index = False)
dicom_train = pd.read_csv("dicom_train.csv")
dicom_train.columns
Index(['SpecificCharacterSet', 'SOPClassUID', 'SOPInstanceUID', 'StudyDate',
'StudyTime', 'AccessionNumber', 'Modality', 'ConversionType',
'ReferringPhysicianName', 'SeriesDescription', 'PatientName',
'PatientID', 'PatientBirthDate', 'PatientSex', 'PatientAge',
'BodyPartExamined', 'ViewPosition', 'StudyInstanceUID',
'SeriesInstanceUID', 'StudyID', 'SeriesNumber', 'InstanceNumber',
'PatientOrientation', 'SamplesPerPixel', 'PhotometricInterpretation',
'Rows', 'Columns', 'MultiPixelSpacing', 'PixelSpacing', 'PixelSpacing1',
'BitsAllocated', 'BitsStored', 'HighBit', 'PixelRepresentation',
'LossyImageCompression', 'LossyImageCompressionMethod', 'fname',
'img_min', 'img_max', 'img_mean', 'img_std', 'img_pct_window'],
dtype='object')
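Alongside the raw header tags, fastai appends derived pixel statistics as the last columns (`img_min`, `img_max`, `img_mean`, `img_std`, `img_pct_window`). The plain summary statistics can be sketched with NumPy alone; the array below is random stand-in data for `ds.pixel_array`, and `img_pct_window` (the fraction of pixels falling inside a DICOM display window) is omitted for brevity:

```python
import numpy as np

# Hypothetical 8-bit pixel array standing in for ds.pixel_array
rng = np.random.default_rng(0)
img = rng.integers(0, 256, size=(1024, 1024), dtype=np.uint8)

# The simple derived columns are plain summary statistics over all pixels
stats = {
    'img_min': int(img.min()),
    'img_max': int(img.max()),
    'img_mean': float(img.mean()),
    'img_std': float(img.std()),
}
print(stats)
```

Statistics like these are useful for spotting outlier scans (e.g. near-constant images) before training, without decoding every file a second time.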
dicom_train.sample(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | SpecificCharacterSet | SOPClassUID | SOPInstanceUID | StudyDate | StudyTime | AccessionNumber | Modality | ConversionType | ReferringPhysicianName | SeriesDescription | PatientName | PatientID | PatientBirthDate | PatientSex | PatientAge | BodyPartExamined | ViewPosition | StudyInstanceUID | SeriesInstanceUID | StudyID | SeriesNumber | InstanceNumber | PatientOrientation | SamplesPerPixel | PhotometricInterpretation | Rows | Columns | MultiPixelSpacing | PixelSpacing | PixelSpacing1 | BitsAllocated | BitsStored | HighBit | PixelRepresentation | LossyImageCompression | LossyImageCompressionMethod | fname | img_min | img_max | img_mean | img_std | img_pct_window |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 10461 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.11678.1517874359.879033 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: AP | 7200a1e9-eba0-4ced-bdd3-88544f162af0 | 7200a1e9-eba0-4ced-bdd3-88544f162af0 | nan | M | 22 | CHEST | AP | 1.2.276.0.7230010.3.1.2.8323329.11678.1517874359.879032 | 1.2.276.0.7230010.3.1.3.8323329.11678.1517874359.879031 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.139000 | 0.139000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\7200a1e9-eba0-4ced-bdd3-88544f162af0.dcm | 0 | -1 | 104.294415 | 60.594108 | 0.318659 |
| 16693 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.30440.1517874499.861241 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: AP | a8e8e360-1c06-4932-820f-cc0066da187e | a8e8e360-1c06-4932-820f-cc0066da187e | nan | F | 66 | CHEST | AP | 1.2.276.0.7230010.3.1.2.8323329.30440.1517874499.861240 | 1.2.276.0.7230010.3.1.3.8323329.30440.1517874499.861239 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.139000 | 0.139000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\a8e8e360-1c06-4932-820f-cc0066da187e.dcm | 0 | -1 | 127.899700 | 53.511569 | 0.164686 |
| 9149 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.4081.1517874305.819620 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | 672a7b2f-f672-46c5-b876-bad17cd8a567 | 672a7b2f-f672-46c5-b876-bad17cd8a567 | nan | F | 34 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.4081.1517874305.819619 | 1.2.276.0.7230010.3.1.3.8323329.4081.1517874305.819618 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.194311 | 0.194311 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\672a7b2f-f672-46c5-b876-bad17cd8a567.dcm | 0 | -105 | 111.929241 | 28.378863 | 0.162151 |
| 18352 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.29517.1517874492.355669 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | b7637d3c-fa05-45c1-816a-1bbb95f07af1 | b7637d3c-fa05-45c1-816a-1bbb95f07af1 | nan | F | 48 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.29517.1517874492.355668 | 1.2.276.0.7230010.3.1.3.8323329.29517.1517874492.355667 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.194311 | 0.194311 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\b7637d3c-fa05-45c1-816a-1bbb95f07af1.dcm | 0 | -1 | 101.650616 | 42.187196 | 0.139111 |
| 7953 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.9763.1517874344.754173 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | 5d2bf120-562a-484a-a2db-65953e2fc734 | 5d2bf120-562a-484a-a2db-65953e2fc734 | nan | M | 70 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.9763.1517874344.754172 | 1.2.276.0.7230010.3.1.3.8323329.9763.1517874344.754171 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.168000 | 0.168000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\5d2bf120-562a-484a-a2db-65953e2fc734.dcm | 5 | -1 | 180.103366 | 49.422149 | 0.044318 |
| 25042 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.21164.1517874432.567653 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | f1d6dbca-db29-41d4-9b6f-2b30fd6e9f69 | f1d6dbca-db29-41d4-9b6f-2b30fd6e9f69 | nan | M | 20 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.21164.1517874432.567652 | 1.2.276.0.7230010.3.1.3.8323329.21164.1517874432.567651 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.143000 | 0.143000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\f1d6dbca-db29-41d4-9b6f-2b30fd6e9f69.dcm | 0 | -1 | 106.378491 | 71.737520 | 0.247437 |
| 9792 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.21830.1517874437.331503 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | 6c5b9330-09d6-4181-ba38-afcbbc00dc3b | 6c5b9330-09d6-4181-ba38-afcbbc00dc3b | nan | F | 56 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.21830.1517874437.331502 | 1.2.276.0.7230010.3.1.3.8323329.21830.1517874437.331501 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.143000 | 0.143000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\6c5b9330-09d6-4181-ba38-afcbbc00dc3b.dcm | 0 | -11 | 124.677175 | 75.587729 | 0.278798 |
| 2847 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.26704.1517874473.667762 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | 304cd9fa-df87-4fb1-bf38-53a130dfa025 | 304cd9fa-df87-4fb1-bf38-53a130dfa025 | nan | M | 56 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.26704.1517874473.667761 | 1.2.276.0.7230010.3.1.3.8323329.26704.1517874473.667760 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.143000 | 0.143000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\304cd9fa-df87-4fb1-bf38-53a130dfa025.dcm | 0 | -9 | 124.546081 | 68.109369 | 0.294114 |
| 20320 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.18186.1517874409.138029 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: AP | c8f8c1af-28f3-420d-85d6-9501a8002097 | c8f8c1af-28f3-420d-85d6-9501a8002097 | nan | M | 20 | CHEST | AP | 1.2.276.0.7230010.3.1.2.8323329.18186.1517874409.138028 | 1.2.276.0.7230010.3.1.3.8323329.18186.1517874409.138027 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.139000 | 0.139000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\c8f8c1af-28f3-420d-85d6-9501a8002097.dcm | 0 | -1 | 96.371264 | 53.526445 | 0.274091 |
| 2186 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.13551.1517874373.268603 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: AP | 1fcc4272-c40e-447c-a15f-6cdf977716a1 | 1fcc4272-c40e-447c-a15f-6cdf977716a1 | nan | F | 43 | CHEST | AP | 1.2.276.0.7230010.3.1.2.8323329.13551.1517874373.268602 | 1.2.276.0.7230010.3.1.3.8323329.13551.1517874373.268601 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.171000 | 0.171000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\1fcc4272-c40e-447c-a15f-6cdf977716a1.dcm | 0 | -20 | 133.794801 | 51.329732 | 0.161374 |
| 15228 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.11844.1517874361.458350 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: AP | 9c6b1f79-c2e8-497a-88aa-34af955ce666 | 9c6b1f79-c2e8-497a-88aa-34af955ce666 | nan | M | 60 | CHEST | AP | 1.2.276.0.7230010.3.1.2.8323329.11844.1517874361.458349 | 1.2.276.0.7230010.3.1.3.8323329.11844.1517874361.458348 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.171000 | 0.171000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\9c6b1f79-c2e8-497a-88aa-34af955ce666.dcm | 0 | -42 | 135.740139 | 38.510586 | 0.052581 |
| 6164 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.5780.1517874318.157929 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: AP | 4d7c5fb9-7776-4ab5-a983-13bba41b39e6 | 4d7c5fb9-7776-4ab5-a983-13bba41b39e6 | nan | F | 73 | CHEST | AP | 1.2.276.0.7230010.3.1.2.8323329.5780.1517874318.157928 | 1.2.276.0.7230010.3.1.3.8323329.5780.1517874318.157927 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.139000 | 0.139000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\4d7c5fb9-7776-4ab5-a983-13bba41b39e6.dcm | 0 | -1 | 130.129486 | 52.586636 | 0.150147 |
| 23123 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.9904.1517874345.620268 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | e1bcfa18-5fed-4a92-85fb-14438a17719b | e1bcfa18-5fed-4a92-85fb-14438a17719b | nan | M | 56 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.9904.1517874345.620267 | 1.2.276.0.7230010.3.1.3.8323329.9904.1517874345.620266 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.143000 | 0.143000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\e1bcfa18-5fed-4a92-85fb-14438a17719b.dcm | 0 | -1 | 127.952350 | 64.711431 | 0.250209 |
| 15075 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.8515.1517874337.843655 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: AP | 9aff4c83-b9ce-407b-aee1-9456acb46d97 | 9aff4c83-b9ce-407b-aee1-9456acb46d97 | nan | M | 68 | CHEST | AP | 1.2.276.0.7230010.3.1.2.8323329.8515.1517874337.843654 | 1.2.276.0.7230010.3.1.3.8323329.8515.1517874337.843653 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.171000 | 0.171000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\9aff4c83-b9ce-407b-aee1-9456acb46d97.dcm | 0 | -18 | 145.779552 | 41.191552 | 0.061395 |
| 25586 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.9078.1517874340.993622 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | f6ae6a07-2f67-49cc-92ac-c84daae85ba3 | f6ae6a07-2f67-49cc-92ac-c84daae85ba3 | nan | M | 68 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.9078.1517874340.993621 | 1.2.276.0.7230010.3.1.3.8323329.9078.1517874340.993620 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.143000 | 0.143000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\f6ae6a07-2f67-49cc-92ac-c84daae85ba3.dcm | 0 | -3 | 122.697148 | 65.806944 | 0.278937 |
| 21550 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.17979.1517874407.857075 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | d3e9cd40-d281-46dd-a115-87b9c806ca19 | d3e9cd40-d281-46dd-a115-87b9c806ca19 | nan | F | 55 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.17979.1517874407.857074 | 1.2.276.0.7230010.3.1.3.8323329.17979.1517874407.857073 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.143000 | 0.143000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\d3e9cd40-d281-46dd-a115-87b9c806ca19.dcm | 0 | -5 | 129.292119 | 62.033524 | 0.174953 |
| 13211 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.10384.1517874351.684129 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | 8abcbbbd-3aa8-4dac-b1cd-932a7d0a072c | 8abcbbbd-3aa8-4dac-b1cd-932a7d0a072c | nan | M | 32 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.10384.1517874351.684128 | 1.2.276.0.7230010.3.1.3.8323329.10384.1517874351.684127 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.143000 | 0.143000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\8abcbbbd-3aa8-4dac-b1cd-932a7d0a072c.dcm | 0 | -3 | 118.586931 | 62.055935 | 0.276836 |
| 6548 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.30441.1517874499.884752 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | 50a23843-cbdb-4570-9033-fdd9e015d9e9 | 50a23843-cbdb-4570-9033-fdd9e015d9e9 | nan | M | 54 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.30441.1517874499.884751 | 1.2.276.0.7230010.3.1.3.8323329.30441.1517874499.884750 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.143000 | 0.143000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\50a23843-cbdb-4570-9033-fdd9e015d9e9.dcm | 0 | -1 | 112.561371 | 68.214553 | 0.349430 |
| 6482 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.28543.1517874485.845932 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: AP | 5013133b-e0dc-4c38-9350-8fd512c09a97 | 5013133b-e0dc-4c38-9350-8fd512c09a97 | nan | M | 60 | CHEST | AP | 1.2.276.0.7230010.3.1.2.8323329.28543.1517874485.845931 | 1.2.276.0.7230010.3.1.3.8323329.28543.1517874485.845930 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.171000 | 0.171000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\5013133b-e0dc-4c38-9350-8fd512c09a97.dcm | 0 | -27 | 132.001262 | 60.921683 | 0.180203 |
| 4338 | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.4823.1517874309.957480 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | 3d6654d3-b967-4de5-8e27-7a1c18588dee | 3d6654d3-b967-4de5-8e27-7a1c18588dee | nan | M | 54 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.4823.1517874309.957479 | 1.2.276.0.7230010.3.1.3.8323329.4823.1517874309.957478 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.143000 | 0.143000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\3d6654d3-b967-4de5-8e27-7a1c18588dee.dcm | 0 | -6 | 112.102633 | 69.223339 | 0.300241 |
len(dicom_train)
26684
The metadata for all 26,684 images has been extracted into the dicom_train dataframe
train_meta_df.head(10).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | patientId | x | y | width | height | Target | class |
|---|---|---|---|---|---|---|---|
| 0 | 0004cfab-14fd-4e49-80ba-63a80b6bddd6 | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 1 | 00313ee0-9eaa-42f4-b0ab-c148ed3241cd | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 2 | 00322d4d-1c29-4943-afc9-b6754be640eb | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 3 | 003d8fa0-6bf1-40ed-b54c-ac657f8495c5 | nan | nan | nan | nan | 0 | Normal |
| 4 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.000000 | 152.000000 | 213.000000 | 379.000000 | 1 | Lung Opacity |
| 5 | 00436515-870c-4b36-a041-de91049b9ab4 | 562.000000 | 152.000000 | 256.000000 | 453.000000 | 1 | Lung Opacity |
| 6 | 00569f44-917d-4c86-a842-81832af98c30 | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 7 | 006cec2e-6ce2-4549-bffa-eadfcd1e9970 | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal |
| 8 | 00704310-78a8-4b38-8475-49f4573b2dbb | 323.000000 | 577.000000 | 160.000000 | 104.000000 | 1 | Lung Opacity |
| 9 | 00704310-78a8-4b38-8475-49f4573b2dbb | 695.000000 | 575.000000 | 162.000000 | 137.000000 | 1 | Lung Opacity |
This is the training dataframe with the patientId, bounding-box coordinates for lung opacity, class, and Target information
print("length of train labels, class dataframe",len(train_meta_df))
print("length of metadata dataframe...........",len(dicom_train))
length of train labels, class dataframe 30227
length of metadata dataframe........... 26684
We need to merge these two dataframes into a single dataframe containing all of the following information:
- patientId
- bounding box info
- class
- target
- filename
- metadata information
# Merge the metadata with training data on PatientID / patientId
dicom_metadata_final = pd.merge(train_meta_df, dicom_train, left_on='patientId', right_on='PatientID')
len(dicom_metadata_final)
30227
dicom_metadata_final.duplicated().sum()
0
dicom_metadata_final = dicom_metadata_final.drop(columns = 'PatientID')
dicom_metadata_final.sample(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | patientId | x | y | width | height | Target | class | SpecificCharacterSet | SOPClassUID | SOPInstanceUID | StudyDate | StudyTime | AccessionNumber | Modality | ConversionType | ReferringPhysicianName | SeriesDescription | PatientName | PatientBirthDate | PatientSex | PatientAge | BodyPartExamined | ViewPosition | StudyInstanceUID | SeriesInstanceUID | StudyID | SeriesNumber | InstanceNumber | PatientOrientation | SamplesPerPixel | PhotometricInterpretation | Rows | Columns | MultiPixelSpacing | PixelSpacing | PixelSpacing1 | BitsAllocated | BitsStored | HighBit | PixelRepresentation | LossyImageCompression | LossyImageCompressionMethod | fname | img_min | img_max | img_mean | img_std | img_pct_window |
|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|---|
| 9548 | 6850e324-6105-4e8e-961e-6ae2f5e71e66 | nan | nan | nan | nan | 0 | Normal | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.11171.1517874356.976841 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | 6850e324-6105-4e8e-961e-6ae2f5e71e66 | nan | F | 48 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.11171.1517874356.976840 | 1.2.276.0.7230010.3.1.3.8323329.11171.1517874356.976839 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.143000 | 0.143000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\6850e324-6105-4e8e-961e-6ae2f5e71e66.dcm | 0 | -4 | 92.979630 | 72.874497 | 0.369698 |
| 1636 | 17aba977-7b64-47b0-acb0-d41abba1fcbc | nan | nan | nan | nan | 0 | Normal | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.10787.1517874354.42396 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | 17aba977-7b64-47b0-acb0-d41abba1fcbc | nan | F | 27 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.10787.1517874354.42395 | 1.2.276.0.7230010.3.1.3.8323329.10787.1517874354.42394 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.139000 | 0.139000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\17aba977-7b64-47b0-acb0-d41abba1fcbc.dcm | 0 | -32 | 130.799788 | 54.465151 | 0.130149 |
| 17309 | a5b3340e-0054-48c2-9966-fe9e2af21da7 | nan | nan | nan | nan | 0 | No Lung Opacity / Not Normal | ISO_IR 100 | 1.2.840.10008.5.1.4.1.1.7 | 1.2.276.0.7230010.3.1.4.8323329.24477.1517874454.716740 | 19010101 | 0.000000 | nan | CR | WSD | nan | view: PA | a5b3340e-0054-48c2-9966-fe9e2af21da7 | nan | F | 43 | CHEST | PA | 1.2.276.0.7230010.3.1.2.8323329.24477.1517874454.716739 | 1.2.276.0.7230010.3.1.3.8323329.24477.1517874454.716738 | nan | 1 | 1 | nan | 1 | MONOCHROME2 | 1024 | 1024 | 1 | 0.143000 | 0.143000 | 8 | 8 | 7 | 0 | 1 | ISO_10918_1 | stage_2_train_images\a5b3340e-0054-48c2-9966-fe9e2af21da7.dcm | 0 | -7 | 150.781827 | 69.885828 | 0.174041 |
[Sample rows of dicom_metadata_final, one row per bounding box. Each row holds the box coordinates (x, y, width, height), Target, class (Lung Opacity / Normal / No Lung Opacity / Not Normal), the DICOM header fields (SpecificCharacterSet ISO_IR 100, SOP/Study/Series UIDs, Modality CR, PatientSex, PatientAge, BodyPartExamined CHEST, ViewPosition AP/PA, 1024 x 1024 Rows/Columns, PixelSpacing, 8-bit pixel depth, ISO_10918_1 compression), the source path under stage_2_train_images\, and the derived image statistics img_min, img_max, img_mean, img_std and img_pct_window.]
The dicom_metadata_final dataframe now holds all 30227 rows with the complete information:
- patientId
- bounding-box co-ordinates
- class
- target
- filename
- header metadata
# Save the merged metadata file to a csv file for future reference
#dicom_metadata_final.to_csv('dicom_metadata_final.csv')
dicom_metadata_final.shape
(30227, 49)
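The merge that produces a frame of this shape can be sketched with toy data. The frame names and column subset below are illustrative, not the notebook's; the key point is the left join on `patientId`, which keeps one output row per bounding box:

```python
import pandas as pd

# Hypothetical miniature stand-ins for the two inputs:
# the labels CSV (one row per bounding box) and the extracted DICOM headers
labels = pd.DataFrame({
    "patientId": ["p1", "p1", "p2"],        # p1 has two bounding boxes
    "x": [100.0, 300.0, None],
    "y": [50.0, 60.0, None],
    "width": [200.0, 150.0, None],
    "height": [250.0, 180.0, None],
    "Target": [1, 1, 0],
})
headers = pd.DataFrame({
    "patientId": ["p1", "p2"],
    "PatientSex": ["M", "F"],
    "PatientAge": [45, 60],
    "ViewPosition": ["AP", "PA"],
})

# A left join keeps one output row per label row, so multi-box patients
# contribute multiple rows -- which is why the merged frame has 30227 rows
# while there are only 26684 unique patients
merged = labels.merge(headers, on="patientId", how="left")
print(merged.shape)
```

This also explains the `freq | 4` entries in the describe() output later: a patient with four boxes repeats four times.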
Pulling out some random images from the different classes:
- Lung Opacity
- Normal
- Not Normal
show_images(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | Folder | Image_Class | Image_file | Full_filename | Image_Class_Category | Image_Target |
|---|---|---|---|---|---|---|
| 14756 | Training | Normal | fd7047e2-1865-4f07-89fa-75a86c79409a.dcm | Training/Normal/fd7047e2-1865-4f07-89fa-75a86c79409a.dcm | 1 | 0 |
| 1492 | Training | Lung Opacity | 3c4a89ed-4338-4410-8848-a0bf226f6e21.dcm | Training/Lung Opacity/3c4a89ed-4338-4410-8848-a0bf226f6e21.dcm | 0 | 1 |
| 22238 | Training | Not Normal | aa93c84b-e688-4e5e-a950-78dc063bcabc.dcm | Training/Not Normal/aa93c84b-e688-4e5e-a950-78dc063bcabc.dcm | 2 | 0 |
| 5304 | Training | Lung Opacity | d5be0945-b546-4640-b0a0-d2dd209122e8.dcm | Training/Lung Opacity/d5be0945-b546-4640-b0a0-d2dd209122e8.dcm | 0 | 1 |
| 1441 | Training | Lung Opacity | 3b3d0da3-cfb5-42c8-a195-6ac03626a11f.dcm | Training/Lung Opacity/3b3d0da3-cfb5-42c8-a195-6ac03626a11f.dcm | 0 | 1 |
| 18620 | Training | Not Normal | 654b2bfc-5d84-42f8-8f8c-61288c661242.dcm | Training/Not Normal/654b2bfc-5d84-42f8-8f8c-61288c661242.dcm | 2 | 0 |
| 4811 | Training | Lung Opacity | be2a8801-3cc0-4c24-a73a-a1e13ff94948.dcm | Training/Lung Opacity/be2a8801-3cc0-4c24-a73a-a1e13ff94948.dcm | 0 | 1 |
| 5480 | Training | Lung Opacity | e0af96b5-9e91-4cc7-84c7-a3d04d4c9405.dcm | Training/Lung Opacity/e0af96b5-9e91-4cc7-84c7-a3d04d4c9405.dcm | 0 | 1 |
| 23480 | Training | Not Normal | cc548ac0-8784-4430-86af-02e1c6bc36c1.dcm | Training/Not Normal/cc548ac0-8784-4430-86af-02e1c6bc36c1.dcm | 2 | 0 |
| 3243 | Training | Lung Opacity | 9341907e-9b62-4197-a0b3-3ddd3c55a884.dcm | Training/Lung Opacity/9341907e-9b62-4197-a0b3-3ddd3c55a884.dcm | 0 | 1 |
| 7622 | Training | Normal | 44e4beae-e4d8-432a-9d99-e21efc4664db.dcm | Training/Normal/44e4beae-e4d8-432a-9d99-e21efc4664db.dcm | 1 | 0 |
| 1237 | Training | Lung Opacity | 36af11b1-75ca-473d-b29c-4bfa1802a81e.dcm | Training/Lung Opacity/36af11b1-75ca-473d-b29c-4bfa1802a81e.dcm | 0 | 1 |
| 5689 | Training | Lung Opacity | ed6e3fd5-ac11-4733-befd-aac7c6db0a7b.dcm | Training/Lung Opacity/ed6e3fd5-ac11-4733-befd-aac7c6db0a7b.dcm | 0 | 1 |
| 23755 | Training | Not Normal | d086792f-5d97-473c-b0f3-291385da233f.dcm | Training/Not Normal/d086792f-5d97-473c-b0f3-291385da233f.dcm | 2 | 0 |
| 21890 | Training | Not Normal | a441e005-54e9-469a-b3ce-5c94a5e42c8c.dcm | Training/Not Normal/a441e005-54e9-469a-b3ce-5c94a5e42c8c.dcm | 2 | 0 |
| 10954 | Training | Normal | 97a54b60-1f60-4d36-a43a-ef4c19904ab1.dcm | Training/Normal/97a54b60-1f60-4d36-a43a-ef4c19904ab1.dcm | 1 | 0 |
| 6469 | Training | Normal | 153c4961-5c28-448a-9dc0-c972ce580671.dcm | Training/Normal/153c4961-5c28-448a-9dc0-c972ce580671.dcm | 1 | 0 |
| 16073 | Training | Not Normal | 38062f40-82c9-4322-b2bb-dacca835bb3b.dcm | Training/Not Normal/38062f40-82c9-4322-b2bb-dacca835bb3b.dcm | 2 | 0 |
| 24990 | Training | Not Normal | e59245f7-7879-4452-bf2c-d986d453b5ef.dcm | Training/Not Normal/e59245f7-7879-4452-bf2c-d986d453b5ef.dcm | 2 | 0 |
| 25709 | Training | Not Normal | f0fb164f-f564-434b-a3df-f74e37394ef7.dcm | Training/Not Normal/f0fb164f-f564-434b-a3df-f74e37394ef7.dcm | 2 | 0 |
plt.show()
Randomly picking and showcasing:
- 5 Lung Opacity class images
- 5 Normal class images
- 5 Not Normal class images
import random
# Needed by this cell; harmless if already imported earlier in the notebook
import pydicom as dicom
import matplotlib.pyplot as plt
import matplotlib.patches as patches
sample_data1 = random.sample(list(Lung_Opacity_Unique), 5)
sample_data2 = random.sample(list(Normal_Unique), 5)
sample_data3 = random.sample(list(Not_Normal_Unique), 5)
samples = sample_data1+sample_data2+sample_data3
fig, ax = plt.subplots(3,5)
fig.set_figheight(20)
fig.set_figwidth(20)
i = 0
j = 0
for pat_id_1 in samples:
no_ind = train_meta_df[train_meta_df['patientId'] == pat_id_1].index.values
l_ind = len(no_ind)
fname = pat_id_1+".dcm"
image_path = "stage_2_train_images/"+fname
ds = dicom.dcmread(image_path)
img = ds.pixel_array
X = 0
Y = 0
width = 0
height = 0
rect = []
ax[i][j].imshow(img, cmap=plt.cm.bone)
for k in range(l_ind):
X = train_meta_df.iloc[no_ind[k],1]
Y = train_meta_df.iloc[no_ind[k],2]
width = train_meta_df.iloc[no_ind[k],3]
height = train_meta_df.iloc[no_ind[k],4]
title = train_meta_df.iloc[no_ind[k],6]
rect.append(patches.Rectangle((X, Y), width, height,linewidth = 3,edgecolor = 'r',facecolor = 'none'))
ax[i][j].add_patch(rect[k])
ax[i][j].set_title(title)
j = j+1
if j == 5:
i = i+1
j = 0
plt.show()
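The inner loop above draws one rectangle per label row for a patient; the lookup behind it can be illustrated with a miniature frame (`train_meta` here is a toy stand-in for `train_meta_df`):

```python
import pandas as pd

# Illustrative miniature: p1 has two bounding boxes, p2 has one
train_meta = pd.DataFrame({
    "patientId": ["p1", "p1", "p2"],
    "x": [644.0, 212.0, 100.0],
    "y": [434.0, 365.0, 50.0],
    "width": [182.0, 184.0, 200.0],
    "height": [347.0, 378.0, 250.0],
})

# Same lookup the plotting loop uses: all row indices for one patient,
# one rectangle drawn per index
no_ind = train_meta[train_meta["patientId"] == "p1"].index.values
print(len(no_ind))   # number of boxes to draw for p1
```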
Exploratory Data Analysis and Visualization of Metadata
Univariate analysis
fig, ax = plt.subplots(2,2)
fig.set_figheight(12)
fig.set_figwidth(12)
# UNIVARIATE ANALYSIS
#1. CLASS - NORMAL, LUNG OPACITY, NO LUNG OPACITY / NOT NORMAL
my_labels1 = dicom_metadata_final['class'].unique()
colors1 = ['#ff9999','#66b3ff','#ffcc99']
my_explode1 = [0.2,0,0]
ax[0][0].pie(dicom_metadata_final['class'].value_counts(sort = False),startangle=120,autopct='%1.1f%%',
colors = colors1,explode = my_explode1,labels = my_labels1)
ax[0][0].set_title("Class Pie Chart\n" + "Normal or Lung Opacity or No Lung Opacity / Not Normal", bbox={'facecolor':'0.8', 'pad':5})
ax[0][0].legend(labels = my_labels1, loc="best")
ax[0][0].axis('equal')
#2. TARGET - 0 - NO PNEUMONIA / 1 - PNEUMONIA
my_labels2 = dicom_metadata_final['Target'].unique()
colors2 = ['#D8BFD8','#ECFFDC']
my_explode2 = [0,0.2]
ax[0][1].pie(dicom_metadata_final['Target'].value_counts(sort = False),startangle=225,autopct='%1.1f%%',
labels = my_labels2,colors = colors2,explode = my_explode2)
ax[0][1].set_title("Target Pie Chart\n" + "1 - Pneumonia / 0 - No Pneumonia", bbox={'facecolor':'0.8', 'pad':5})
ax[0][1].legend(labels = my_labels2, loc="best")
ax[0][1].axis('equal')
#3. PATIENT SEX - M - MALE / F - FEMALE
my_labels3 = dicom_metadata_final['PatientSex'].unique()
colors3 = ['#66b3ff','#ff9999']
my_explode3 = [0,0.2]
ax[1][0].pie(dicom_metadata_final['PatientSex'].value_counts(sort = False),startangle=90,autopct='%1.1f%%',
labels = my_labels3,colors = colors3,explode = my_explode3)
ax[1][0].set_title("Patient Sex Pie Chart\n" + "Whether Patient is Male - M or Female - F", bbox={'facecolor':'0.8', 'pad':5})
ax[1][0].legend(labels = my_labels3, loc="best")
ax[1][0].axis('equal')
#4. VIEW POSITION - PA - POSTERIOR TO ANTERIOR / AP - ANTERIOR TO POSTERIOR
my_labels4 = dicom_metadata_final['ViewPosition'].unique()
colors4 = ['#C1E1C1','#FAC898']
my_explode4 = [0,0.2]
ax[1][1].pie(dicom_metadata_final['ViewPosition'].value_counts(sort = False),startangle=90,autopct='%1.1f%%',
labels = my_labels4,colors = colors4,explode = my_explode4)
ax[1][1].set_title("View Position Pie Chart\n" + "PA - Posterior to Anterior / AP - Anterior to Posterior", bbox={'facecolor':'0.8', 'pad':5})
ax[1][1].legend(labels = my_labels4, loc="best")
ax[1][1].axis('equal')
fig.tight_layout()
plt.show()
Observations:
- Pneumonia accounts for about one third of the cases and non-pneumonia for about two thirds
- More male patients than female patients were examined
- The AP and PA view positions occur in roughly equal proportions
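The one-third/two-thirds split read off the pie chart can also be computed directly with `value_counts(normalize=True)`; a minimal sketch with a toy `Target` column:

```python
import pandas as pd

target = pd.Series([1, 0, 0, 1, 0, 0])          # toy Target column: 2 of 6 positive
proportions = target.value_counts(normalize=True)
print(proportions[1])                           # fraction of pneumonia cases
```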
# CREATE A NEW COLUMN "AGE_GROUP" AND ADD IT TO dicom_metadata_final
# ASSIGN THE VALUE BASED ON PATIENT AGE: 1-20, 21-40, 41-60, 61-80, 81-100, >100
AGE = []
dicom_metadata_final['PatientAge'] = dicom_metadata_final['PatientAge'].astype("int64")
# Iterate over the values directly; positional indexing with [i] breaks
# whenever the dataframe index is not exactly 0..n-1
for age in dicom_metadata_final['PatientAge']:
    if 0 < age <= 20:
        AGE.append('1-20')
    elif 20 < age <= 40:
        AGE.append('21-40')
    elif 40 < age <= 60:
        AGE.append('41-60')
    elif 60 < age <= 80:
        AGE.append('61-80')
    elif 80 < age <= 100:
        AGE.append('81-100')
    else:
        AGE.append('>100')
dicom_metadata_final['AGE_GROUP'] = pd.Series(AGE).values
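The same binning can be done in one step with `pd.cut`, whose intervals are right-inclusive by default and therefore match the `<=` comparisons above; a self-contained sketch with toy ages:

```python
import pandas as pd

ages = pd.Series([5, 25, 49, 72, 93, 155])      # toy ages, one per bin
bins = [0, 20, 40, 60, 80, 100, float("inf")]   # right edge of each group
labels = ["1-20", "21-40", "41-60", "61-80", "81-100", ">100"]
# pd.cut assigns each age to its (left, right] interval
groups = pd.cut(ages, bins=bins, labels=labels)
print(list(groups))
```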
# AGE GROUP - PATIENT AGE GROUP - [1-20, 21-40, 41-60, 61-80, 81-100, >100]
sns.histplot(data = dicom_metadata_final, x = "AGE_GROUP",multiple="dodge", shrink=.8,palette = "Spectral")
plt.title("Patient Age Group Histogram\n" + "Shows the binned age of patients", bbox={'facecolor':'0.8', 'pad':5})
plt.show()
There seem to be some issues in the age data: quite a few ages are reported above 100, which are unlikely to be genuine, and one was even recorded as 155. Treating those as recording mistakes, age is binned into:
- 01 - 20
- 21 - 40
- 41 - 60
- 61 - 80
- 81 - 100
- greater than 100
The largest number of patients falls in the 41-60 group, followed by 21-40 and 61-80, with relatively few patients in 1-20, 81-100 and >100.
This pattern looks plausible: the oldest groups (81-100 and >100) naturally contain few patients, and so does the youngest group 1-20, since younger people are more likely to be hale and healthy.
Bi-variate analysis
fig, ax = plt.subplots(2,3)
fig.set_figheight(12)
fig.set_figwidth(12)
# BI-VARIATE ANALYSIS
#1. PATIENT SEX AND CLASS
ax[0][0].set_title("Patient Sex vs Class" + "\n............................................", bbox={'facecolor':'0.8', 'pad':5})
sns.countplot(data = dicom_metadata_final, x = "PatientSex",hue = "class", edgecolor = ".6",palette = "Spectral",ax = ax[0][0])
#2. PATIENT SEX AND TARGET
sns.countplot(data = dicom_metadata_final, x = "PatientSex",hue = "Target", edgecolor = ".6",palette = "Spectral",ax = ax[0][1])
ax[0][1].set_title("Patient Sex vs Target" + "\n............................................", bbox={'facecolor':'0.8', 'pad':5})
#3. PATIENT AGE GROUP AND CLASS
sns.countplot(data = dicom_metadata_final, x = "AGE_GROUP",hue = "class", edgecolor = ".6",palette = "Spectral",ax = ax[0][2])
ax[0][2].set_title("Patient Age vs Class\n" + "\n............................................", bbox={'facecolor':'0.8', 'pad':5})
#4. PATIENT AGE GROUP AND TARGET
sns.countplot(data = dicom_metadata_final, x = "AGE_GROUP",hue = "Target", edgecolor = ".6",palette = "Spectral",ax = ax[1][0])
ax[1][0].set_title("Patient Age vs Target\n" + "\n............................................", bbox={'facecolor':'0.8', 'pad':5})
#5. VIEW POSITION AND CLASS
sns.countplot(data = dicom_metadata_final, x = "ViewPosition",hue = "class", edgecolor = ".6",palette = "Spectral",ax = ax[1][1])
ax[1][1].set_title("View Position vs Class\n" + "\n............................................", bbox={'facecolor':'0.8', 'pad':5})
#6. VIEW POSITION AND TARGET
sns.countplot(data = dicom_metadata_final, x = "ViewPosition",hue = "Target", edgecolor = ".6",palette = "Spectral",ax = ax[1][2])
ax[1][2].set_title("View Position vs Target\n" + "\n............................................", bbox={'facecolor':'0.8', 'pad':5})
fig.tight_layout()
plt.show()
Observations:
1. Sex vs Class
- Although there are more male patients than female patients, as observed earlier, the class proportions are roughly similar between the sexes
2. Sex vs Target
- Again, since there are more male patients, the absolute number of pneumonia cases is higher among them than among female patients, but the proportions are roughly similar between the sexes
3. Age group vs Class
- As observed earlier, the 41-60 age group has the most patients, so all classes peak in that group. Normal and Lung Opacity counts are nearly equal in 41-60, while Not Normal counts are markedly higher
- The 21-40 group is the next largest; interestingly, all three classes have nearly equal counts in this group
- The 61-80 group has more Not Normal cases than the 21-40 group
4. Age group vs Target
- The 41-60 group has the most pneumonia cases, followed by 21-40, then 61-80, then 1-20
5. View Position vs Class
- PA (posterior to anterior) means the X-ray beam enters through the patient's back, with the chest facing the film
- AP (anterior to posterior) means the beam enters through the patient's front, with the back facing the film
- More Normal cases are found in the PA view than in the AP view
- No Lung Opacity / Not Normal cases are roughly balanced between the PA and AP views
6. View Position vs Target
- Far more pneumonia cases appear in AP views than in PA views
#6. Class & TARGET
fig, ax = plt.subplots(1,1)
fig.set_figheight(6)
fig.set_figwidth(8)
sns.countplot(data = dicom_metadata_final, x = "Target", hue = "class", edgecolor = ".6",palette = "Spectral",ax=ax)
ax.set_title("Class vs Target\n" + "\n............................................", bbox={'facecolor':'0.8', 'pad':5})
fig.tight_layout()
plt.show()
Observation:
- As expected, the Lung Opacity class appears only with Target = 1, while the Normal and Not Normal classes appear only with Target = 0
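That class/target relationship can be checked exactly with `pd.crosstab`; a toy sketch (the four rows below are illustrative, not taken from the dataset):

```python
import pandas as pd

df = pd.DataFrame({
    "class": ["Lung Opacity", "Normal", "No Lung Opacity / Not Normal", "Lung Opacity"],
    "Target": [1, 0, 0, 1],
})
# Cross-tabulate class against Target; missing combinations show up as 0
ct = pd.crosstab(df["class"], df["Target"])
print(ct)
# In the real data every Lung Opacity row should land in the Target=1 column
# and the other two classes entirely in Target=0
```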
dicom_metadata_final.describe(include='object').transpose().style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | count | unique | top | freq |
|---|---|---|---|---|
| patientId | 30227 | 26684 | 3239951b-6211-4290-b237-3d9ad17176db | 4 |
| class | 30227 | 3 | No Lung Opacity / Not Normal | 11821 |
| SpecificCharacterSet | 30227 | 1 | ISO_IR 100 | 30227 |
| SOPClassUID | 30227 | 1 | 1.2.840.10008.5.1.4.1.1.7 | 30227 |
| SOPInstanceUID | 30227 | 26684 | 1.2.276.0.7230010.3.1.4.8323329.2740.1517874298.197769 | 4 |
| Modality | 30227 | 1 | CR | 30227 |
| ConversionType | 30227 | 1 | WSD | 30227 |
| SeriesDescription | 30227 | 2 | view: AP | 15297 |
| PatientName | 30227 | 26684 | 3239951b-6211-4290-b237-3d9ad17176db | 4 |
| PatientSex | 30227 | 2 | M | 17216 |
| BodyPartExamined | 30227 | 1 | CHEST | 30227 |
| ViewPosition | 30227 | 2 | AP | 15297 |
| StudyInstanceUID | 30227 | 26684 | 1.2.276.0.7230010.3.1.2.8323329.2740.1517874298.197768 | 4 |
| SeriesInstanceUID | 30227 | 26684 | 1.2.276.0.7230010.3.1.3.8323329.2740.1517874298.197767 | 4 |
| PhotometricInterpretation | 30227 | 1 | MONOCHROME2 | 30227 |
| LossyImageCompressionMethod | 30227 | 1 | ISO_10918_1 | 30227 |
| fname | 30227 | 26684 | stage_2_train_images\3239951b-6211-4290-b237-3d9ad17176db.dcm | 4 |
| AGE_GROUP | 30227 | 6 | 41-60 | 13117 |
dicom_metadata_final.describe().transpose().style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | count | mean | std | min | 25% | 50% | 75% | max |
|---|---|---|---|---|---|---|---|---|
| x | 9555.000000 | 394.047724 | 204.574172 | 2.000000 | 207.000000 | 324.000000 | 594.000000 | 835.000000 |
| y | 9555.000000 | 366.839560 | 148.940488 | 2.000000 | 249.000000 | 365.000000 | 478.500000 | 881.000000 |
| width | 9555.000000 | 218.471376 | 59.289475 | 40.000000 | 177.000000 | 217.000000 | 259.000000 | 528.000000 |
| height | 9555.000000 | 329.269702 | 157.750755 | 45.000000 | 203.000000 | 298.000000 | 438.000000 | 942.000000 |
| Target | 30227.000000 | 0.316108 | 0.464963 | 0.000000 | 0.000000 | 0.000000 | 1.000000 | 1.000000 |
| StudyDate | 30227.000000 | 19010101.000000 | 0.000000 | 19010101.000000 | 19010101.000000 | 19010101.000000 | 19010101.000000 | 19010101.000000 |
| StudyTime | 30227.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
| AccessionNumber | 0.000000 | nan | nan | nan | nan | nan | nan | nan |
| ReferringPhysicianName | 0.000000 | nan | nan | nan | nan | nan | nan | nan |
| PatientBirthDate | 0.000000 | nan | nan | nan | nan | nan | nan | nan |
| PatientAge | 30227.000000 | 46.797764 | 16.892940 | 1.000000 | 34.000000 | 49.000000 | 59.000000 | 155.000000 |
| StudyID | 0.000000 | nan | nan | nan | nan | nan | nan | nan |
| SeriesNumber | 30227.000000 | 1.000000 | 0.000000 | 1.000000 | 1.000000 | 1.000000 | 1.000000 | 1.000000 |
| InstanceNumber | 30227.000000 | 1.000000 | 0.000000 | 1.000000 | 1.000000 | 1.000000 | 1.000000 | 1.000000 |
| PatientOrientation | 0.000000 | nan | nan | nan | nan | nan | nan | nan |
| SamplesPerPixel | 30227.000000 | 1.000000 | 0.000000 | 1.000000 | 1.000000 | 1.000000 | 1.000000 | 1.000000 |
| Rows | 30227.000000 | 1024.000000 | 0.000000 | 1024.000000 | 1024.000000 | 1024.000000 | 1024.000000 | 1024.000000 |
| Columns | 30227.000000 | 1024.000000 | 0.000000 | 1024.000000 | 1024.000000 | 1024.000000 | 1024.000000 | 1024.000000 |
| MultiPixelSpacing | 30227.000000 | 1.000000 | 0.000000 | 1.000000 | 1.000000 | 1.000000 | 1.000000 | 1.000000 |
| PixelSpacing | 30227.000000 | 0.155481 | 0.015823 | 0.115000 | 0.143000 | 0.143000 | 0.168000 | 0.198800 |
| PixelSpacing1 | 30227.000000 | 0.155481 | 0.015823 | 0.115000 | 0.143000 | 0.143000 | 0.168000 | 0.198800 |
| BitsAllocated | 30227.000000 | 8.000000 | 0.000000 | 8.000000 | 8.000000 | 8.000000 | 8.000000 | 8.000000 |
| BitsStored | 30227.000000 | 8.000000 | 0.000000 | 8.000000 | 8.000000 | 8.000000 | 8.000000 | 8.000000 |
| HighBit | 30227.000000 | 7.000000 | 0.000000 | 7.000000 | 7.000000 | 7.000000 | 7.000000 | 7.000000 |
| PixelRepresentation | 30227.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 0.000000 |
| LossyImageCompression | 30227.000000 | 1.000000 | 0.000000 | 1.000000 | 1.000000 | 1.000000 | 1.000000 | 1.000000 |
| img_min | 30227.000000 | 0.365931 | 2.097720 | 0.000000 | 0.000000 | 0.000000 | 0.000000 | 95.000000 |
| img_max | 30227.000000 | -8.060740 | 13.975120 | -110.000000 | -10.000000 | -1.000000 | -1.000000 | -1.000000 |
| img_mean | 30227.000000 | 124.547933 | 21.551214 | 26.417231 | 110.284383 | 119.870465 | 136.768286 | 229.147834 |
| img_std | 30227.000000 | 58.048557 | 10.772212 | 18.836551 | 50.516496 | 58.984575 | 66.681578 | 99.117195 |
| img_pct_window | 30227.000000 | 0.215550 | 0.098674 | 0.000000 | 0.144041 | 0.216644 | 0.278998 | 0.919351 |
There are many columns in the metadata that have
- nulls in all the rows
- the same value repeated in all the rows
- a unique value in each row
None of these columns add information for the multivariate analysis, hence only the 20 required columns are kept out of the 49 total.
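Such columns can also be found programmatically rather than by inspection; a minimal sketch with illustrative toy columns (the names mirror the metadata but the values are made up):

```python
import pandas as pd

df = pd.DataFrame({
    "patientId": ["p1", "p2", "p3"],     # unique in every row
    "Modality": ["CR", "CR", "CR"],      # same value in every row
    "StudyID": [None, None, None],       # null in every row
    "PatientAge": [45, 60, 45],          # informative
})

# Columns that are entirely null, constant, or unique per row carry no signal
all_null = [c for c in df.columns if df[c].isna().all()]
constant = [c for c in df.columns if df[c].nunique(dropna=True) == 1]
all_unique = [c for c in df.columns if df[c].nunique(dropna=True) == len(df)]
print(all_null, constant, all_unique)
```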
dicom_required_df = dicom_metadata_final[["patientId","x","y","width","height","Target","class","SOPInstanceUID",
"PatientSex","PatientAge","ViewPosition","StudyInstanceUID","SeriesInstanceUID","PixelSpacing",
"PixelSpacing1","img_min","img_max","img_mean","img_std","img_pct_window"]]
dicom_required_df.dtypes
patientId             object
x                    float64
y                    float64
width                float64
height               float64
Target                 int64
class                 object
SOPInstanceUID        object
PatientSex            object
PatientAge             int64
ViewPosition          object
StudyInstanceUID      object
SeriesInstanceUID     object
PixelSpacing         float64
PixelSpacing1        float64
img_min                int64
img_max                int64
img_mean             float64
img_std              float64
img_pct_window       float64
dtype: object
print("dicom_metadata_final shape",dicom_metadata_final.shape)
print("dicom_required_df shape",dicom_required_df.shape)
dicom_metadata_final shape (30227, 49)
dicom_required_df shape (30227, 20)
The dicom_required_df dataframe has the 20 columns required for multivariate analysis
dicom_eda_df = dicom_required_df[["patientId", "Target", "class", "PatientSex", "PatientAge", "ViewPosition"]]
dicom_eda_df.dtypes
patientId        object
Target            int64
class            object
PatientSex       object
PatientAge        int64
ViewPosition     object
dtype: object
fig, ax = plt.subplots(1,1,figsize=(7,7))
# Restrict to Target = 1 rows: only they carry bounding-box coordinates (the rest are NaN)
target_sample = dicom_required_df[dicom_required_df['Target'] == 1].sample(2000)
target_sample['xc'] = target_sample['x'] + target_sample['width'] / 2
target_sample['yc'] = target_sample['y'] + target_sample['height'] / 2
plt.title("Centers of Lung Opacity rectangles (red) over rectangles (cyan)\nSample size: 2000")
target_sample.plot.scatter(x='xc', y='yc', xlim=(0,1024), ylim=(0,1024), ax=ax, alpha=0.8, marker=".", color="red")
for i, crt_sample in target_sample.iterrows():
ax.add_patch(patches.Rectangle(xy=(crt_sample['x'], crt_sample['y']),
width=crt_sample['width'],height=crt_sample['height'],alpha=3.5e-3, color="cyan"))
plt.show()
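The center coordinates used for the scatter are just the box midpoints; a toy check (made-up boxes) that the midpoints land inside the 1024 x 1024 frame:

```python
import pandas as pd

boxes = pd.DataFrame({
    "x": [100.0, 300.0], "y": [50.0, 60.0],
    "width": [200.0, 150.0], "height": [250.0, 180.0],
})
# Same midpoint formula as the plotting cell
boxes["xc"] = boxes["x"] + boxes["width"] / 2
boxes["yc"] = boxes["y"] + boxes["height"] / 2
print(boxes[["xc", "yc"]].values.tolist())
assert ((boxes["xc"] < 1024) & (boxes["yc"] < 1024)).all()
```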
Multi-Variate Analysis
# MULTI VARIATE ANALYSIS
#1. CORRELATION PLOT ON dicom_required_df
def CORR_PLOT(df):
    # numeric_only skips the non-numeric columns, which would otherwise raise in newer pandas
    df.corr(numeric_only = True)['Target'].sort_values(ascending = False).plot(kind = 'bar', figsize = (20,5))
    plt.show()
CORR_PLOT(dicom_required_df)
#2. BOX PLOT ON dicom_metadata_df
dicom_required_df.boxplot(figsize=(35,15))
plt.title("Box Plot on Metadata\n" + "............................................", bbox={'facecolor':'0.8', 'pad':5})
plt.show()
#3. HEAT MAP
plt.figure(figsize = (20,20))
sns.heatmap(dicom_required_df.drop(columns = 'Target').corr(numeric_only = True), annot = True, linewidths = 1, square = True, cmap="YlOrRd")
plt.title("Heat Map on Metadata\n" + "............................................", bbox={'facecolor':'0.8', 'pad':5})
plt.show()
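Multicollinearity shows up in the heat map as off-diagonal values near ±1; a toy sketch of how a perfectly collinear pair is detected (PixelSpacing and PixelSpacing1 above are such a pair, since they hold identical values):

```python
import pandas as pd

df = pd.DataFrame({
    "spacing": [0.139, 0.143, 0.168, 0.171],
    "spacing_copy": [0.139, 0.143, 0.168, 0.171],   # identical column
    "other": [4.0, 1.0, 3.0, 2.0],
})
corr = df.corr()
# An identical pair yields a correlation of 1.0 -> perfectly collinear
print(corr.loc["spacing", "spacing_copy"])
```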
Apart from the PixelSpacing and PixelSpacing1 columns, which are identical, there appear to be no multicollinearity issues.
# Reset index of the DF before proceeding with the bounding boxes
train_Lung_Opacity = train_Lung_Opacity.reset_index(drop = True)
import math  # for the subplot-row computation below
def bound_box(passed_sample):
    # Take a random sample of patient ids and draw the bounding-box patches on their images
    sample_data = random.sample(list(Lung_Opacity_Unique), passed_sample)
    z = math.ceil(passed_sample / 5)      # number of subplot rows, 5 images per row
    rows = []                             # bounding-box records collected here
    # squeeze=False keeps ax 2-D even when there is a single row
    fig, ax = plt.subplots(z, 5, squeeze = False, constrained_layout = True)
    fig.set_figheight(20)
    fig.set_figwidth(20)
    i = 0
    j = 0
    for pat_id_1 in sample_data:
        no_ind = train_Lung_Opacity[train_Lung_Opacity['patientId'] == pat_id_1].index.values
        l_ind = len(no_ind)
        fname = pat_id_1 + ".dcm"
        image_path = "Training/Lung Opacity/" + fname
        ds = dicom.dcmread(image_path)
        img = ds.pixel_array
        ax[i][j].imshow(img, cmap = plt.cm.bone)
        for k in range(l_ind):
            X = train_Lung_Opacity.iloc[no_ind[k],1]
            Y = train_Lung_Opacity.iloc[no_ind[k],2]
            width = train_Lung_Opacity.iloc[no_ind[k],3]
            height = train_Lung_Opacity.iloc[no_ind[k],4]
            rows.append([pat_id_1, l_ind, X, Y, width, height])
            ax[i][j].add_patch(patches.Rectangle((X, Y), width, height,linewidth = 3,edgecolor = 'r',facecolor = 'none'))
        j = j+1
        if j == 5:
            i = i+1
            j = 0
    plt.show()
    # DataFrame.append was removed in pandas 2.0, so build the frame from the collected rows
    bounding_box_data = pd.DataFrame(rows, columns = ['Image Name','No. Of Bounding Boxes','X','Y','Width','Height'])
    return(bounding_box_data)
# Add columns to the bounding_box_data
# bounding_box_data.columns = ['Patient_ID','No. of Bounding Boxes','X','Y','Width','Height']
# Save the sample bounding_box_data to a csv file for reference
# bounding_box_data.to_csv(cap_path+'bounding_box_data.csv')
# Styler.hide_index() was removed in pandas 2.0; hide(axis="index") is the current equivalent
bound_box(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'}).hide(axis = "index")
| Image Name | No. Of Bounding Boxes | X | Y | Width | Height |
|---|---|---|---|---|---|
| 17d34b01-a492-4764-9ee7-1403ed7a3f97 | 1 | 636.000000 | 498.000000 | 278.000000 | 288.000000 |
| ba29c8c8-c116-4b8e-b8d6-ec870ca9f9e5 | 1 | 272.000000 | 457.000000 | 219.000000 | 364.000000 |
| 1b7f9ebf-bd1c-4195-9186-d6adb65945ee | 1 | 348.000000 | 313.000000 | 122.000000 | 383.000000 |
| ee8a4128-a907-475b-a363-9cd60d65e46e | 2 | 644.000000 | 434.000000 | 182.000000 | 347.000000 |
| ee8a4128-a907-475b-a363-9cd60d65e46e | 2 | 212.000000 | 365.000000 | 184.000000 | 378.000000 |
| 9ded4d8b-5e88-4470-bb20-1d55d76d6658 | 2 | 467.000000 | 513.000000 | 315.000000 | 363.000000 |
| 9ded4d8b-5e88-4470-bb20-1d55d76d6658 | 2 | 18.000000 | 264.000000 | 306.000000 | 542.000000 |
| 617d8a06-9de7-4b9e-b7c8-a9c234a44602 | 1 | 634.000000 | 483.000000 | 201.000000 | 147.000000 |
| 9d82f9f6-b84d-48d0-9ce5-68329990d7fa | 1 | 229.000000 | 348.000000 | 221.000000 | 221.000000 |
| 750d6b01-4273-43b2-8de8-500e738cb257 | 1 | 383.000000 | 351.000000 | 115.000000 | 186.000000 |
| 74340139-eebf-445a-a0a1-2b9101233a3a | 1 | 662.000000 | 468.000000 | 156.000000 | 165.000000 |
| 0c89ca38-27ca-4cd0-88cb-18093fdcb04b | 1 | 153.000000 | 369.000000 | 257.000000 | 181.000000 |
| 822ac3a2-71b5-4540-9139-2dcca518f09f | 2 | 651.000000 | 420.000000 | 265.000000 | 426.000000 |
| 822ac3a2-71b5-4540-9139-2dcca518f09f | 2 | 281.000000 | 148.000000 | 270.000000 | 770.000000 |
| b3f132cb-438b-40d4-a951-0089e8b1a840 | 2 | 166.000000 | 230.000000 | 271.000000 | 606.000000 |
| b3f132cb-438b-40d4-a951-0089e8b1a840 | 2 | 556.000000 | 345.000000 | 256.000000 | 563.000000 |
| 66ab31a6-be10-4f4e-ad4d-b4aeb9371271 | 1 | 367.000000 | 423.000000 | 131.000000 | 181.000000 |
| 844ac956-e1e2-4f27-b58c-390d6405c3a0 | 2 | 429.000000 | 59.000000 | 359.000000 | 692.000000 |
| 844ac956-e1e2-4f27-b58c-390d6405c3a0 | 2 | 16.000000 | 60.000000 | 386.000000 | 631.000000 |
| d50e7c06-768f-4943-8e00-392d664fe580 | 1 | 187.000000 | 259.000000 | 278.000000 | 432.000000 |
| 23ca0450-4138-4e7a-9489-0f5b6a91031d | 2 | 621.000000 | 399.000000 | 193.000000 | 91.000000 |
| 23ca0450-4138-4e7a-9489-0f5b6a91031d | 2 | 94.000000 | 414.000000 | 224.000000 | 134.000000 |
| 37f5c61c-c395-468f-a4ea-7108343a22cb | 2 | 166.000000 | 158.000000 | 300.000000 | 652.000000 |
| 37f5c61c-c395-468f-a4ea-7108343a22cb | 2 | 588.000000 | 178.000000 | 265.000000 | 630.000000 |
| d72cd242-ad82-480f-acd1-e1b3a793cad2 | 2 | 662.000000 | 275.000000 | 208.000000 | 339.000000 |
| d72cd242-ad82-480f-acd1-e1b3a793cad2 | 2 | 195.000000 | 409.000000 | 213.000000 | 363.000000 |
| 1ef28712-7b54-49de-838a-3a56326d26a0 | 1 | 568.000000 | 481.000000 | 194.000000 | 173.000000 |
| b62a6a88-6ad1-43cf-ad59-6d171e89b265 | 1 | 587.000000 | 340.000000 | 250.000000 | 335.000000 |
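Growing a DataFrame row by row inside the plotting loop is slow, and `DataFrame.append` was deprecated in pandas 1.4 and removed in 2.0. A minimal sketch of the collect-then-construct pattern, using hypothetical patient IDs and box coordinates:

```python
import pandas as pd

# Collect one dict per bounding box, then build the DataFrame once.
# Patient IDs and coordinates here are hypothetical placeholders.
boxes_per_patient = {
    "pat-1": [(636, 498, 278, 288)],
    "pat-2": [(644, 434, 182, 347), (212, 365, 184, 378)],
}
rows = []
for pat_id, boxes in boxes_per_patient.items():
    for x, y, w, h in boxes:
        rows.append({"Image Name": pat_id,
                     "No. Of Bounding Boxes": len(boxes),
                     "X": x, "Y": y, "Width": w, "Height": h})

bounding_box_data = pd.DataFrame(rows)
print(bounding_box_data.shape)  # (3, 6): one row per box
```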
Images_df.sample(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| Folder | Image_Class | Image_file | Full_filename | Image_Class_Category | Image_Target | |
|---|---|---|---|---|---|---|
| 9063 | Training | Normal | 6927a615-8ec4-4219-aa92-e368d628cb2e.dcm | Training/Normal/6927a615-8ec4-4219-aa92-e368d628cb2e.dcm | 1 | 0 |
| 13219 | Training | Normal | d5011cb2-a146-4ca6-a055-2ec956315350.dcm | Training/Normal/d5011cb2-a146-4ca6-a055-2ec956315350.dcm | 1 | 0 |
| 17121 | Training | Not Normal | 4c2749cb-7bb8-46d8-9d14-491d0035f891.dcm | Training/Not Normal/4c2749cb-7bb8-46d8-9d14-491d0035f891.dcm | 2 | 0 |
| 17925 | Training | Not Normal | 58b049d6-b48c-4edc-ac83-753591aa359d.dcm | Training/Not Normal/58b049d6-b48c-4edc-ac83-753591aa359d.dcm | 2 | 0 |
| 4927 | Training | Lung Opacity | c04f9256-7358-42ef-b843-19dbf64ef287.dcm | Training/Lung Opacity/c04f9256-7358-42ef-b843-19dbf64ef287.dcm | 0 | 1 |
| 24237 | Training | Not Normal | d89f8d29-3832-4d89-8647-d16f9079c25a.dcm | Training/Not Normal/d89f8d29-3832-4d89-8647-d16f9079c25a.dcm | 2 | 0 |
| 667 | Training | Lung Opacity | 1c44e0a4-4612-438f-9a83-8d5bf919cb67.dcm | Training/Lung Opacity/1c44e0a4-4612-438f-9a83-8d5bf919cb67.dcm | 0 | 1 |
| 23130 | Training | Not Normal | c76bb89f-781d-4c53-8685-58f981394a09.dcm | Training/Not Normal/c76bb89f-781d-4c53-8685-58f981394a09.dcm | 2 | 0 |
| 8488 | Training | Normal | 5bfa4ba1-9ea6-42c5-b44f-1a58dfe648bf.dcm | Training/Normal/5bfa4ba1-9ea6-42c5-b44f-1a58dfe648bf.dcm | 1 | 0 |
| 23058 | Training | Not Normal | c648be1a-3426-414b-80af-fd4cbdffa7f2.dcm | Training/Not Normal/c648be1a-3426-414b-80af-fd4cbdffa7f2.dcm | 2 | 0 |
| 8358 | Training | Normal | 58bf27e0-a307-4d29-a977-511340f4572f.dcm | Training/Normal/58bf27e0-a307-4d29-a977-511340f4572f.dcm | 1 | 0 |
| 19026 | Training | Not Normal | 6bb95d1b-a502-430f-ba27-550549d70de3.dcm | Training/Not Normal/6bb95d1b-a502-430f-ba27-550549d70de3.dcm | 2 | 0 |
| 25695 | Training | Not Normal | f09d8288-c4f7-480d-a24d-b1c54c5ede45.dcm | Training/Not Normal/f09d8288-c4f7-480d-a24d-b1c54c5ede45.dcm | 2 | 0 |
| 17748 | Training | Not Normal | 55467f65-236a-4b29-b7e3-6de1e1c8f973.dcm | Training/Not Normal/55467f65-236a-4b29-b7e3-6de1e1c8f973.dcm | 2 | 0 |
| 9952 | Training | Normal | 7fbf1fe8-f97d-4e7c-9ac2-65d4e06f33b4.dcm | Training/Normal/7fbf1fe8-f97d-4e7c-9ac2-65d4e06f33b4.dcm | 1 | 0 |
| 22298 | Training | Not Normal | ad1ab5c2-fe92-498e-b82f-9464f2140c43.dcm | Training/Not Normal/ad1ab5c2-fe92-498e-b82f-9464f2140c43.dcm | 2 | 0 |
| 2377 | Training | Lung Opacity | 6cba26dc-51c2-4a3a-bbd7-3af8dd5bfd65.dcm | Training/Lung Opacity/6cba26dc-51c2-4a3a-bbd7-3af8dd5bfd65.dcm | 0 | 1 |
| 8432 | Training | Normal | 5a9235a7-8034-4115-9c0a-9612ef02f968.dcm | Training/Normal/5a9235a7-8034-4115-9c0a-9612ef02f968.dcm | 1 | 0 |
| 7314 | Training | Normal | 3b81bcea-ee88-4c13-b4f6-2a53161d2b22.dcm | Training/Normal/3b81bcea-ee88-4c13-b4f6-2a53161d2b22.dcm | 1 | 0 |
| 18306 | Training | Not Normal | 5fddd203-a282-4f29-b741-dd03bfcec835.dcm | Training/Not Normal/5fddd203-a282-4f29-b741-dd03bfcec835.dcm | 2 | 0 |
Images_df.dtypes
Folder                   object
Image_Class              object
Image_file               object
Full_filename            object
Image_Class_Category      int32
Image_Target              int64
dtype: object
Images_sample1_a = Images_df[Images_df["Image_Target"] == 0]
Images_sample2_a = Images_df[Images_df["Image_Target"] == 1]
print("No Pneumonia", len(Images_sample1_a))
print("Pneumonia", len(Images_sample2_a))
No Pneumonia 20672
Pneumonia 6012
sample1_a = Images_sample1_a.sample(n = 3000)
sample2_a = Images_sample2_a.sample(n = 3000)
sams_1a = [sample1_a, sample2_a]
result_1a = pd.concat(sams_1a)
print(result_1a.shape)
print(result_1a["Image_Target"].value_counts())
(6000, 6)
0    3000
1    3000
Name: Image_Target, dtype: int64
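The two `.sample()` calls plus `pd.concat` above can also be expressed as a single `groupby(...).sample(...)` call (available since pandas 1.1). A minimal sketch on a toy frame:

```python
import pandas as pd

# Toy frame standing in for Images_df: an imbalanced binary target.
df = pd.DataFrame({"Image_Target": [0] * 10 + [1] * 4})

# One groupby.sample call draws an equal number of rows per class.
balanced = df.groupby("Image_Target", group_keys=False).sample(n=3, random_state=1)
print(sorted(balanced["Image_Target"].value_counts().to_dict().items()))  # [(0, 3), (1, 3)]
```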
Images_6000_Target = result_1a.copy()
img_rows=224
img_cols=224
dim = (img_rows, img_cols)
X_target = []
brk = 0
i = 1 # initialisation
for img in tqdm(Images_6000_Target["Full_filename"].values):
ds_3 = dicom.dcmread(img)
img_3 = ds_3.pixel_array
rgb = apply_color_lut(img_3, palette='PET')
train_img = rgb
try:
train_img_resize = cv2.resize(train_img, dim, interpolation=cv2.INTER_LINEAR)
except cv2.error:
brk += 1
print("resize failed for", img)
break
height_2, width_2, layers = train_img_resize.shape
size=(width_2,height_2)
X_target.append(train_img_resize)
i += 1
100%|██████████████████████████████████████████████████████████████████████████████| 6000/6000 [02:41<00:00, 37.26it/s]
Pickling the 6000 sampled 3-dimensional images, split by target (1 & 0)
fileName = "Images_X_3_target_6000.pkl"
with open(fileName, 'wb') as fileObject:
pkl.dump(X_target, fileObject)
Pickling Images dataframe
Images_6000_Target.to_pickle("Images_6000_Target.pkl") # pickling the proportionate target from 6000 Images dataframe
Unpickling Images_6000_Target.pkl and recreating the 6000-image dataframe
with open("Images_6000_Target.pkl", "rb") as image_pickle:
Images_6000_Target = pkl.load(image_pickle)
Images_6000_Target.sample(10).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| Folder | Image_Class | Image_file | Full_filename | Image_Class_Category | Image_Target | |
|---|---|---|---|---|---|---|
| 17686 | Training | Not Normal | 59cdf26b-d2de-436b-9d67-4c8317ea46be.dcm | /content/drive/MyDrive/Colab Notebooks/Project/Capstone Project//Training/Not Normal/59cdf26b-d2de-436b-9d67-4c8317ea46be.dcm | 2 | 0 |
| 18263 | Training | Not Normal | 6477a41c-821b-4c21-88cf-677f73d8bcd4.dcm | /content/drive/MyDrive/Colab Notebooks/Project/Capstone Project//Training/Not Normal/6477a41c-821b-4c21-88cf-677f73d8bcd4.dcm | 2 | 0 |
| 19141 | Training | Not Normal | 71ecae41-9818-4f71-a2c6-9e237efca012.dcm | /content/drive/MyDrive/Colab Notebooks/Project/Capstone Project//Training/Not Normal/71ecae41-9818-4f71-a2c6-9e237efca012.dcm | 2 | 0 |
| 3091 | Training | Lung Opacity | 9c0fbb4b-715d-43ad-ac16-3cd798caaab3.dcm | /content/drive/MyDrive/Colab Notebooks/Project/Capstone Project//Training/Lung Opacity/9c0fbb4b-715d-43ad-ac16-3cd798caaab3.dcm | 0 | 1 |
| 4958 | Training | Lung Opacity | d60b0cba-c1bd-4741-86ef-c24760605b03.dcm | /content/drive/MyDrive/Colab Notebooks/Project/Capstone Project//Training/Lung Opacity/d60b0cba-c1bd-4741-86ef-c24760605b03.dcm | 0 | 1 |
| 14551 | Training | Normal | 0103fadb-1663-40a6-8a9e-09d626cd2091.dcm | /content/drive/MyDrive/Colab Notebooks/Project/Capstone Project//Training/Normal/0103fadb-1663-40a6-8a9e-09d626cd2091.dcm | 1 | 0 |
| 2617 | Training | Lung Opacity | 85f49d66-c50b-4f89-be34-2aee655583b3.dcm | /content/drive/MyDrive/Colab Notebooks/Project/Capstone Project//Training/Lung Opacity/85f49d66-c50b-4f89-be34-2aee655583b3.dcm | 0 | 1 |
| 16289 | Training | Not Normal | 43d0854c-a7f1-4901-addc-fa88cecf99d5.dcm | /content/drive/MyDrive/Colab Notebooks/Project/Capstone Project//Training/Not Normal/43d0854c-a7f1-4901-addc-fa88cecf99d5.dcm | 2 | 0 |
| 25559 | Training | Not Normal | f36259c8-6751-4c6f-b9d4-44fd4f86e61b.dcm | /content/drive/MyDrive/Colab Notebooks/Project/Capstone Project//Training/Not Normal/f36259c8-6751-4c6f-b9d4-44fd4f86e61b.dcm | 2 | 0 |
| 4198 | Training | Lung Opacity | b8d7b42e-8dc7-4330-a8f3-db52852f1714.dcm | /content/drive/MyDrive/Colab Notebooks/Project/Capstone Project//Training/Lung Opacity/b8d7b42e-8dc7-4330-a8f3-db52852f1714.dcm | 0 | 1 |
Unpickling the 3-dimensional X array of 6000 samples
with open("Images_X_3_target_6000.pkl", "rb") as image_X_3:
X = pkl.load(image_X_3)
X[0:2]
[array([[[ 0, 39, 38],
[ 0, 35, 35],
[ 0, 34, 33],
...,
[ 0, 40, 40],
[ 0, 45, 44],
[ 0, 34, 33]],
[[ 0, 36, 35],
[ 0, 34, 33],
[ 0, 31, 30],
...,
[ 0, 54, 53],
[ 0, 58, 56],
[ 0, 85, 83]],
[[ 0, 34, 33],
[ 0, 32, 32],
[ 0, 30, 29],
...,
[ 0, 25, 24],
[ 0, 17, 17],
[ 0, 20, 19]],
...,
[[ 0, 53, 52],
[ 0, 58, 57],
[ 0, 48, 47],
...,
[ 44, 82, 170],
[ 20, 107, 146],
[ 0, 118, 116]],
[[ 0, 52, 51],
[ 0, 55, 54],
[ 0, 49, 48],
...,
[ 51, 76, 177],
[ 24, 102, 150],
[ 1, 126, 127]],
[[ 0, 55, 53],
[ 0, 48, 47],
[ 0, 44, 43],
...,
[ 57, 70, 183],
[ 28, 99, 154],
[ 1, 126, 126]]], dtype=uint8),
array([[[ 0, 3, 2],
[ 0, 7, 6],
[ 0, 3, 2],
...,
[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0]],
[[ 0, 3, 2],
[ 0, 7, 6],
[ 0, 6, 5],
...,
[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0]],
[[ 0, 3, 2],
[ 0, 9, 8],
[ 0, 6, 5],
...,
[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0]],
...,
[[ 0, 17, 16],
[146, 17, 221],
[ 68, 60, 194],
...,
[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0]],
[[ 0, 17, 16],
[119, 8, 246],
[ 99, 28, 226],
...,
[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0]],
[[ 0, 15, 14],
[ 84, 43, 210],
[127, 6, 247],
...,
[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0]]], dtype=uint8)]
X_tar = np.array(X)
X_tar.shape
(6000, 224, 224, 3)
print(X_tar.max())
print(X_tar.min())
255
0
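The max/min check shows the images are raw uint8 intensities in [0, 255]; the models below are trained on these raw values. Rescaling to [0, 1] is a common preprocessing step for CNN inputs, sketched here on dummy data:

```python
import numpy as np

# Four dummy uint8 images in [0, 255] standing in for X_tar.
X_demo = np.random.randint(0, 256, size=(4, 224, 224, 3), dtype=np.uint8)

# Cast to float32 and rescale to [0, 1] before feeding a CNN.
X_scaled = X_demo.astype("float32") / 255.0
print(X_scaled.dtype, float(X_scaled.min()) >= 0.0, float(X_scaled.max()) <= 1.0)  # float32 True True
```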
y1_tar = Images_6000_Target['Image_Target']
y1_tar.value_counts()
0    3000
1    3000
Name: Image_Target, dtype: int64
y1_tar_cat = to_categorical(y1_tar, num_classes=2)
print("Shape of y1_tar_cat:", y1_tar_cat.shape)
Shape of y1_tar_cat: (6000, 2)
y1_tar_cat[0:10]
array([[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.]], dtype=float32)
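`to_categorical` one-hot encodes the integer target; the same encoding can be written with `np.eye`, and `np.argmax` inverts it (as done later when scoring predictions). A minimal sketch:

```python
import numpy as np

# One-hot encode integer labels with np.eye -- equivalent to
# keras.utils.to_categorical for labels in {0, 1}.
y = np.array([0, 1, 1, 0])
y_cat = np.eye(2, dtype="float32")[y]   # shape (4, 2), one-hot rows
y_back = np.argmax(y_cat, axis=1)       # argmax inverts the encoding
print(y_cat.shape, bool((y_back == y).all()))  # (4, 2) True
```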
print("X_tar shape",X_tar.shape)
print("y1_tar_cat shape",y1_tar_cat.shape)
X_tar shape (6000, 224, 224, 3) y1_tar_cat shape (6000, 2)
Splitting into training, test, and validation sets
X_train, X_test, y_train, y_test = train_test_split(X_tar, y1_tar_cat, test_size=.20, stratify=y1_tar_cat, random_state=1) # 80% Training and 20% Testing
print("X_train",X_train.shape)
print("y_train",y_train.shape)
print("X_test",X_test.shape)
print("y_test",y_test.shape)
X_train (4800, 224, 224, 3)
y_train (4800, 2)
X_test (1200, 224, 224, 3)
y_test (1200, 2)
X_train1, X_val, y_train1, y_val = train_test_split(X_train, y_train, test_size=.20, stratify=y_train, random_state=1) # 80% training and 20% validation
print("X_train1",X_train1.shape)
print("y_train1",y_train1.shape)
print("X_val",X_val.shape)
print("y_val",y_val.shape)
X_train1 (3840, 224, 224, 3)
y_train1 (3840, 2)
X_val (960, 224, 224, 3)
y_val (960, 2)
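The `stratify` argument keeps the class ratio identical in every split. A toy check with a hypothetical 60/40 label vector:

```python
import numpy as np
from sklearn.model_selection import train_test_split

# Hypothetical 60/40 label vector; X is just row indices.
y = np.array([0] * 60 + [1] * 40)
X = np.arange(100).reshape(-1, 1)

# stratify=y preserves the 60/40 class ratio in both splits.
X_tr, X_te, y_tr, y_te = train_test_split(
    X, y, test_size=0.2, stratify=y, random_state=1)
print(np.bincount(y_tr), np.bincount(y_te))  # [48 32] [12 8]
```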
print(X_val[0])
print(y_val[0])
[[[ 0 107 105] [ 0 76 74] [ 0 50 49] ... [ 0 84 82] [ 0 90 88] [ 0 108 106]] [[ 0 107 105] [ 0 75 74] [ 0 48 47] ... [ 0 46 45] [ 0 47 46] [ 0 63 61]] [[ 0 107 104] [ 0 75 73] [ 0 46 45] ... [ 0 16 15] [ 0 20 18] [ 0 33 32]] ... [[ 0 18 17] [ 0 6 5] [ 0 10 9] ... [ 0 24 23] [ 0 45 44] [ 0 78 75]] [[ 0 20 19] [ 0 13 12] [ 0 18 17] ... [ 0 28 27] [ 0 47 46] [ 0 78 76]] [[ 0 28 27] [ 0 24 23] [ 0 30 29] ... [ 0 30 29] [ 0 48 47] [ 0 79 77]]] [1. 0.]
Creating a CNN Architecture
tf.keras.backend.clear_session()
# Initialize the model
model = Sequential()
# Add a Convolutional Layer with 32 filters of size 7x7, strides (2, 2) and 'relu' activation
model.add(Conv2D(filters=32, kernel_size=7, strides=(2, 2), activation="relu", input_shape=(224, 224, 3)))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(BatchNormalization())
# Add a Convolutional Layer with 64 filters of size 5x5, strides (1, 1) and 'relu' activation
model.add(Conv2D(filters=64, kernel_size=5, strides=(1, 1), activation="relu"))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2)))
# model.add(Dropout(rate=0.3))
# model.add(GlobalMaxPooling2D())
# Apply Dropout with 0.2 probability
# model.add(Dropout(rate=0.2))
# Add a Convolutional Layer with 64 filters of size 3X3 and activation function as 'relu'
# model.add(Conv2D(filters=64, kernel_size=3, strides=(2, 2), activation="relu"))
# model.add(MaxPooling2D(pool_size=(2, 2)))
# model.add(BatchNormalization())
# Flatten the layer
model.add(Flatten())
# Add Fully Connected Layer with 128 units and activation function as 'relu'
model.add(Dense(128, activation="relu"))
model.add(Dense(64, activation="relu"))
# model.add(Dropout(rate=0.3))
# model.add(BatchNormalization())
# model.add(Dense(128, activation="relu"))
# Add the output layer with 2 units and 'sigmoid' activation
model.add(Dense(2, activation="sigmoid"))
optimizer = Adam(learning_rate=0.01)
model.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 109, 109, 32) 4736
max_pooling2d (MaxPooling2D (None, 54, 54, 32) 0
)
batch_normalization (BatchN (None, 54, 54, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 50, 50, 64) 51264
batch_normalization_1 (Batc (None, 50, 50, 64) 256
hNormalization)
max_pooling2d_1 (MaxPooling (None, 25, 25, 64) 0
2D)
flatten (Flatten) (None, 40000) 0
dense (Dense) (None, 128) 5120128
dense_1 (Dense) (None, 64) 8256
dense_2 (Dense) (None, 2) 130
=================================================================
Total params: 5,184,898
Trainable params: 5,184,706
Non-trainable params: 192
_________________________________________________________________
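The spatial dimensions in the summary follow the standard no-padding ("valid") formula out = floor((in - kernel) / stride) + 1, which also explains the Flatten size. A quick check:

```python
# 'valid' (no-padding) convolution/pooling output size.
def conv_out(n, kernel, stride):
    return (n - kernel) // stride + 1

# Trace the spatial dims of the first model, matching the summary above.
n = 224
n = conv_out(n, 7, 2)  # Conv2D 7x7, stride 2  -> 109
n = conv_out(n, 2, 2)  # MaxPooling 2x2        -> 54
n = conv_out(n, 5, 1)  # Conv2D 5x5, stride 1  -> 50
n = conv_out(n, 2, 2)  # MaxPooling 2x2        -> 25
print(n * n * 64)      # 40000 -- the Flatten size in the summary
```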
batch_size = 64
nb_epochs = 10
history = model.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 34s 550ms/step - loss: 4.3237 - accuracy: 0.6383 - val_loss: 14.4939 - val_accuracy: 0.5354
Epoch 2/10
60/60 [==============================] - 37s 609ms/step - loss: 0.6385 - accuracy: 0.7107 - val_loss: 1.2404 - val_accuracy: 0.6750
Epoch 3/10
60/60 [==============================] - 36s 603ms/step - loss: 0.5882 - accuracy: 0.7247 - val_loss: 0.7027 - val_accuracy: 0.6833
Epoch 4/10
60/60 [==============================] - 36s 600ms/step - loss: 0.5466 - accuracy: 0.7294 - val_loss: 0.6329 - val_accuracy: 0.7052
Epoch 5/10
60/60 [==============================] - 36s 596ms/step - loss: 0.5267 - accuracy: 0.7479 - val_loss: 0.6261 - val_accuracy: 0.7188
Epoch 6/10
60/60 [==============================] - 36s 592ms/step - loss: 0.5200 - accuracy: 0.7443 - val_loss: 0.6133 - val_accuracy: 0.7292
Epoch 7/10
60/60 [==============================] - 36s 596ms/step - loss: 0.5267 - accuracy: 0.7487 - val_loss: 0.6255 - val_accuracy: 0.7167
Epoch 8/10
60/60 [==============================] - 36s 603ms/step - loss: 0.5303 - accuracy: 0.7495 - val_loss: 0.5592 - val_accuracy: 0.7406
Epoch 9/10
60/60 [==============================] - 36s 599ms/step - loss: 0.4858 - accuracy: 0.7711 - val_loss: 0.6025 - val_accuracy: 0.7260
Epoch 10/10
60/60 [==============================] - 35s 590ms/step - loss: 0.4649 - accuracy: 0.7828 - val_loss: 0.6397 - val_accuracy: 0.7344
plt.figure(figsize=(6,4))
plt.plot(history.history['loss'], label='train')
plt.plot(history.history['val_loss'], label='validation')
plt.legend()
plt.show()
plt.figure(figsize=(6,4))
plt.plot(history.history['accuracy'], label='train')
plt.plot(history.history['val_accuracy'], label='validation')
plt.legend()
plt.show()
tf.keras.backend.clear_session()
# second model
model_2 = Sequential()
model_2.add(Conv2D(filters=32, kernel_size=7, strides=(2, 2), activation="relu", input_shape=(224, 224, 3)))
model_2.add(MaxPooling2D(pool_size=(2, 2)))
model_2.add(BatchNormalization())
model_2.add(Conv2D(filters=64, kernel_size=5, strides=(1, 1), activation="relu"))
model_2.add(MaxPooling2D(pool_size=(2, 2)))
model_2.add(BatchNormalization())
#model_2.add(Dropout(rate=0.3))
model_2.add(Conv2D(filters=128, kernel_size=3, strides=(1, 1), activation="relu"))
model_2.add(MaxPooling2D(pool_size=(2, 2)))
model_2.add(BatchNormalization())
model_2.add(Flatten())
model_2.add(Dense(128, activation="relu"))
# model_2.add(BatchNormalization())
model_2.add(Dense(84, activation="relu"))
# model_2.add(BatchNormalization())
model_2.add(Dense(42, activation="relu"))
model_2.add(Dense(2, activation="sigmoid"))
optimizer = Adam(learning_rate=0.001)
model_2.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_2.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 109, 109, 32) 4736
max_pooling2d (MaxPooling2D (None, 54, 54, 32) 0
)
batch_normalization (BatchN (None, 54, 54, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 50, 50, 64) 51264
max_pooling2d_1 (MaxPooling (None, 25, 25, 64) 0
2D)
batch_normalization_1 (Batc (None, 25, 25, 64) 256
hNormalization)
conv2d_2 (Conv2D) (None, 23, 23, 128) 73856
max_pooling2d_2 (MaxPooling (None, 11, 11, 128) 0
2D)
batch_normalization_2 (Batc (None, 11, 11, 128) 512
hNormalization)
flatten (Flatten) (None, 15488) 0
dense (Dense) (None, 128) 1982592
dense_1 (Dense) (None, 84) 10836
dense_2 (Dense) (None, 42) 3570
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 2,127,836
Trainable params: 2,127,388
Non-trainable params: 448
_________________________________________________________________
batch_size = 64
nb_epochs = 20
history_2 = model_2.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=3, min_delta=0.0001)])
Epoch 1/20
60/60 [==============================] - 35s 576ms/step - loss: 0.6190 - accuracy: 0.6997 - val_loss: 1.3135 - val_accuracy: 0.5698
Epoch 2/20
60/60 [==============================] - 37s 610ms/step - loss: 0.5663 - accuracy: 0.7219 - val_loss: 0.6272 - val_accuracy: 0.6802
Epoch 3/20
60/60 [==============================] - 37s 611ms/step - loss: 0.5180 - accuracy: 0.7437 - val_loss: 0.5994 - val_accuracy: 0.7271
Epoch 4/20
60/60 [==============================] - 37s 609ms/step - loss: 0.4965 - accuracy: 0.7594 - val_loss: 0.6675 - val_accuracy: 0.6896
Epoch 5/20
60/60 [==============================] - 36s 607ms/step - loss: 0.4684 - accuracy: 0.7729 - val_loss: 0.6723 - val_accuracy: 0.6854
Epoch 6/20
60/60 [==============================] - 37s 612ms/step - loss: 0.4272 - accuracy: 0.7964 - val_loss: 0.7234 - val_accuracy: 0.6729
plt.figure(figsize=(6,4))
plt.plot(history_2.history['loss'], label='train')
plt.plot(history_2.history['val_loss'], label='validation')
plt.legend()
plt.show()
plt.figure(figsize=(6,4))
plt.plot(history_2.history['accuracy'], label='train')
plt.plot(history_2.history['val_accuracy'], label='validation')
plt.legend()
plt.show()
tf.keras.backend.clear_session()
# third model
model_2a = Sequential()
model_2a.add(Conv2D(filters=32, kernel_size=5, strides=(1, 1), activation="relu", input_shape=(224, 224, 3)))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Conv2D(filters=64, kernel_size=5, strides=(1, 1), activation="relu"))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Conv2D(filters=128, kernel_size=3, strides=(1, 1), activation="relu"))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Dropout(rate=0.2))
model_2a.add(Conv2D(filters=256, kernel_size=3, strides=(1, 1), activation="relu"))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Dropout(rate=0.2))
#model_2a.add(Conv2D(filters=512, kernel_size=3, strides=(1, 1), activation="relu"))
#model_2a.add(MaxPooling2D(pool_size=(2, 2)))
#model_2a.add(BatchNormalization())
model_2a.add(Flatten())
#model_2a.add(Dense(512, activation="relu"))
# model_2a.add(Dropout(rate=0.7))
model_2a.add(Dense(256, activation="relu"))
model_2a.add(Dropout(rate=0.25))
model_2a.add(Dense(128, activation="relu"))
# model_2a.add(Dropout(rate=0.3))
model_2a.add(Dense(64, activation="relu"))
model_2a.add(Dense(32, activation="relu"))
model_2a.add(Dense(2, activation="sigmoid"))
optimizer = Adam(learning_rate=0.0001)
model_2a.compile(optimizer = optimizer , loss = "categorical_crossentropy", metrics=["accuracy"])
model_2a.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 220, 220, 32) 2432
max_pooling2d (MaxPooling2D (None, 110, 110, 32) 0
)
batch_normalization (BatchN (None, 110, 110, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 106, 106, 64) 51264
max_pooling2d_1 (MaxPooling (None, 53, 53, 64) 0
2D)
batch_normalization_1 (Batc (None, 53, 53, 64) 256
hNormalization)
conv2d_2 (Conv2D) (None, 51, 51, 128) 73856
max_pooling2d_2 (MaxPooling (None, 25, 25, 128) 0
2D)
batch_normalization_2 (Batc (None, 25, 25, 128) 512
hNormalization)
dropout (Dropout) (None, 25, 25, 128) 0
conv2d_3 (Conv2D) (None, 23, 23, 256) 295168
max_pooling2d_3 (MaxPooling (None, 11, 11, 256) 0
2D)
batch_normalization_3 (Batc (None, 11, 11, 256) 1024
hNormalization)
dropout_1 (Dropout) (None, 11, 11, 256) 0
flatten (Flatten) (None, 30976) 0
dense (Dense) (None, 256) 7930112
dropout_2 (Dropout) (None, 256) 0
dense_1 (Dense) (None, 128) 32896
dense_2 (Dense) (None, 64) 8256
dense_3 (Dense) (None, 32) 2080
dense_4 (Dense) (None, 2) 66
=================================================================
Total params: 8,398,050
Trainable params: 8,397,090
Non-trainable params: 960
_________________________________________________________________
batch_size = 54
nb_epochs = 20
history_2a = model_2a.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=3, min_delta=0.01)])
Epoch 1/20
72/72 [==============================] - 174s 2s/step - loss: 0.6146 - accuracy: 0.6885 - val_loss: 0.6254 - val_accuracy: 0.6854
Epoch 2/20
72/72 [==============================] - 176s 2s/step - loss: 0.5556 - accuracy: 0.7232 - val_loss: 0.5876 - val_accuracy: 0.6875
Epoch 3/20
72/72 [==============================] - 175s 2s/step - loss: 0.5299 - accuracy: 0.7414 - val_loss: 0.5784 - val_accuracy: 0.6969
Epoch 4/20
72/72 [==============================] - 176s 2s/step - loss: 0.4800 - accuracy: 0.7680 - val_loss: 0.5746 - val_accuracy: 0.7240
Epoch 5/20
72/72 [==============================] - 176s 2s/step - loss: 0.4530 - accuracy: 0.7831 - val_loss: 0.5635 - val_accuracy: 0.7333
Epoch 6/20
72/72 [==============================] - 176s 2s/step - loss: 0.4168 - accuracy: 0.8057 - val_loss: 0.6096 - val_accuracy: 0.7104
Epoch 7/20
72/72 [==============================] - 176s 2s/step - loss: 0.3621 - accuracy: 0.8320 - val_loss: 0.6227 - val_accuracy: 0.7031
Epoch 8/20
72/72 [==============================] - 175s 2s/step - loss: 0.3344 - accuracy: 0.8510 - val_loss: 0.6787 - val_accuracy: 0.7063
plt.figure(figsize=(6,4))
plt.plot(history_2a.history['loss'], label='train')
plt.plot(history_2a.history['val_loss'], label='validation')
plt.legend()
plt.show()
plt.figure(figsize=(6,4))
plt.plot(history_2a.history['accuracy'], label='train')
plt.plot(history_2a.history['val_accuracy'], label='validation')
plt.legend()
plt.show()
tf.keras.backend.clear_session()
# fourth model
model_3 = Sequential()
model_3.add(Conv2D(filters=32, kernel_size=5, strides=(1, 1), activation="relu", input_shape=(224, 224, 3)))
model_3.add(MaxPooling2D(pool_size=(2, 2)))
model_3.add(Conv2D(filters=64, kernel_size=5, strides=(1, 1), kernel_initializer='normal', activation="relu"))
model_3.add(MaxPooling2D(pool_size=(2, 2)))
model_3.add(Conv2D(filters=128, kernel_size=3, strides=(1, 1), kernel_initializer='normal', activation="relu"))
model_3.add(MaxPooling2D(pool_size=(2, 2)))
model_3.add(BatchNormalization())
model_3.add(Conv2D(filters=256, kernel_size=3, strides=(1, 1), kernel_initializer='normal', activation="relu"))
model_3.add(MaxPooling2D(pool_size=(2, 2)))
model_3.add(Dropout(rate=0.3))
model_3.add(Flatten())
model_3.add(Dense(128, kernel_initializer='normal', activation="relu"))
model_3.add(Dense(84, kernel_initializer='normal', activation="relu"))
model_3.add(Dense(42, kernel_initializer='normal', activation="relu"))
model_3.add(Dense(2, activation="sigmoid"))
optimizer = Adam(learning_rate=0.0001)
model_3.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_3.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 220, 220, 32) 2432
max_pooling2d (MaxPooling2D (None, 110, 110, 32) 0
)
conv2d_1 (Conv2D) (None, 106, 106, 64) 51264
max_pooling2d_1 (MaxPooling (None, 53, 53, 64) 0
2D)
conv2d_2 (Conv2D) (None, 51, 51, 128) 73856
max_pooling2d_2 (MaxPooling (None, 25, 25, 128) 0
2D)
batch_normalization (BatchN (None, 25, 25, 128) 512
ormalization)
conv2d_3 (Conv2D) (None, 23, 23, 256) 295168
max_pooling2d_3 (MaxPooling (None, 11, 11, 256) 0
2D)
dropout (Dropout) (None, 11, 11, 256) 0
flatten (Flatten) (None, 30976) 0
dense (Dense) (None, 128) 3965056
dense_1 (Dense) (None, 84) 10836
dense_2 (Dense) (None, 42) 3570
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 4,402,780
Trainable params: 4,402,524
Non-trainable params: 256
_________________________________________________________________
batch_size = 64
nb_epochs = 20
history_3 = model_3.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=3, min_delta=0.001)])
Epoch 1/20
60/60 [==============================] - 156s 3s/step - loss: 0.6527 - accuracy: 0.6594 - val_loss: 0.6083 - val_accuracy: 0.7031
Epoch 2/20
60/60 [==============================] - 160s 3s/step - loss: 0.5647 - accuracy: 0.7286 - val_loss: 0.5843 - val_accuracy: 0.7198
Epoch 3/20
60/60 [==============================] - 158s 3s/step - loss: 0.5501 - accuracy: 0.7260 - val_loss: 0.5555 - val_accuracy: 0.7385
Epoch 4/20
60/60 [==============================] - 158s 3s/step - loss: 0.5343 - accuracy: 0.7414 - val_loss: 0.5385 - val_accuracy: 0.7385
Epoch 5/20
60/60 [==============================] - 158s 3s/step - loss: 0.5088 - accuracy: 0.7625 - val_loss: 0.5368 - val_accuracy: 0.7375
Epoch 6/20
60/60 [==============================] - 158s 3s/step - loss: 0.4993 - accuracy: 0.7607 - val_loss: 0.5299 - val_accuracy: 0.7437
Epoch 7/20
60/60 [==============================] - 158s 3s/step - loss: 0.4814 - accuracy: 0.7734 - val_loss: 0.5503 - val_accuracy: 0.7281
Epoch 8/20
60/60 [==============================] - 159s 3s/step - loss: 0.4626 - accuracy: 0.7799 - val_loss: 0.5377 - val_accuracy: 0.7448
Epoch 9/20
60/60 [==============================] - 158s 3s/step - loss: 0.4446 - accuracy: 0.7969 - val_loss: 0.5479 - val_accuracy: 0.7260
plt.figure(figsize=(6,4))
plt.plot(history_3.history['loss'], label='train')
plt.plot(history_3.history['val_loss'], label='validation')
plt.legend()
plt.show()
plt.figure(figsize=(6,4))
plt.plot(history_3.history['accuracy'], label='train')
plt.plot(history_3.history['val_accuracy'], label='validation')
plt.legend()
plt.show()
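To compare the runs side by side, the validation-accuracy histories can be overlaid on one axis. A sketch with hypothetical curve values standing in for `history.history['val_accuracy']`, `history_2.history['val_accuracy']`, etc.:

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; plt.show() in a notebook
import matplotlib.pyplot as plt

# Hypothetical val_accuracy curves -- in practice these come from the
# History objects returned by model.fit above.
val_curves = {
    "model":    [0.54, 0.68, 0.68, 0.71, 0.72],
    "model_2":  [0.57, 0.68, 0.73, 0.69, 0.69],
    "model_2a": [0.69, 0.69, 0.70, 0.72, 0.73],
}
plt.figure(figsize=(6, 4))
for name, curve in val_curves.items():
    plt.plot(range(1, len(curve) + 1), curve, label=name)  # one curve per model
plt.xlabel("epoch")
plt.ylabel("val_accuracy")
plt.legend()
plt.savefig("val_accuracy_comparison.png")
```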
plot_model(model,to_file="model_0_plot.png", show_shapes = True, show_layer_names = True)
plot_model(model_2,to_file="model_2_plot.png", show_shapes = True, show_layer_names = True)
plot_model(model_2a,to_file="model_2a_plot.png", show_shapes = True, show_layer_names = True)
plot_model(model_3,to_file="model_3_plot.png", show_shapes = True, show_layer_names = True)
# serialize model to JSON
model_json = model.to_json()
with open("model.json", "w") as json_file:
json_file.write(model_json)
# serialize weights to HDF5
model.save_weights("model.h5")
print("Saved model to disk")
Saved model to disk
# serialize model_2 to JSON
model_json = model_2.to_json()
with open("model_2.json", "w") as json_file:
json_file.write(model_json)
# serialize weights to HDF5
model_2.save_weights("model_2.h5")
print("Saved model to disk")
Saved model to disk
# serialize model_2a to JSON
model_json = model_2a.to_json()
with open("model_2a.json", "w") as json_file:
json_file.write(model_json)
# serialize weights to HDF5
model_2a.save_weights("model_2a.h5")
print("Saved model to disk")
Saved model to disk
# serialize model_3 to JSON
model_json = model_3.to_json()
with open("model_3.json", "w") as json_file:
json_file.write(model_json)
# serialize weights to HDF5
model_3.save_weights("model_3.h5")
print("Saved model to disk")
Saved model to disk
# load json and create model
json_file = open('model.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)
# load weights into new model
loaded_model.load_weights("model.h5")
print("Loaded model from disk")
loaded_model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
score = loaded_model.evaluate(X_test, y_test, verbose=0)
print("%s: %.2f%%" % (loaded_model.metrics_names[1], score[1]*100))
Loaded model from disk
accuracy: 71.25%
# load json and create model
json_file = open('model_2.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)
# load weights into new model
loaded_model.load_weights("model_2.h5")
print("Loaded model from disk")
loaded_model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
score = loaded_model.evaluate(X_test, y_test, verbose=0)
print("%s: %.2f%%" % (loaded_model.metrics_names[1], score[1]*100))
Loaded model from disk
accuracy: 71.25%
# load json and create model
json_file = open('model_2a.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)
# load weights into new model
loaded_model.load_weights("model_2a.h5")
print("Loaded model from disk")
loaded_model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
score = loaded_model.evaluate(X_test, y_test, verbose=0)
print("%s: %.2f%%" % (loaded_model.metrics_names[1], score[1]*100))
Loaded model from disk
accuracy: 71.25%
# load json and create model
json_file = open('model_3.json', 'r')
loaded_model_json = json_file.read()
json_file.close()
loaded_model = model_from_json(loaded_model_json)
# load weights into new model
loaded_model.load_weights("model_3.h5")
print("Loaded model from disk")
loaded_model.compile(loss='binary_crossentropy', optimizer='adam', metrics=['accuracy'])
score = loaded_model.evaluate(X_test, y_test, verbose=0)
print("%s: %.2f%%" % (loaded_model.metrics_names[1], score[1]*100))
Loaded model from disk
accuracy: 71.25%
loss_1, accuracy_1 = model.evaluate(X_test,y_test)
38/38 [==============================] - 2s 47ms/step - loss: 0.6716 - accuracy: 0.7125
y_pred_1 = model.predict(X_test)
38/38 [==============================] - 2s 47ms/step
y_test_arg=np.argmax(y_test,axis=1)
Y_pred_1 = np.argmax(y_pred_1,axis=1)
RESULTS = pd.DataFrame()
RESULTS = RESULTS.append(pd.Series(["model",np.round((loss_1*100),2),np.round((accuracy_1*100),2)]),ignore_index=True)
cm = confusion_matrix(y_test_arg, Y_pred_1)
labels = ["0 - No Pneumonia", "1 - Pneumonia"]
disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=labels)
disp.plot()
plt.show()
print(classification_report(y_test_arg, Y_pred_1))
precision recall f1-score support
0 0.70 0.75 0.72 600
1 0.73 0.68 0.70 600
accuracy 0.71 1200
macro avg 0.71 0.71 0.71 1200
weighted avg 0.71 0.71 0.71 1200
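The report's precision, recall, and F1 follow directly from the confusion matrix shown in the plot. A minimal sketch with illustrative counts chosen to be consistent with the rounded values above (the exact matrix is the one in the `ConfusionMatrixDisplay` plot):

```python
import numpy as np

# Illustrative 2x2 confusion matrix: rows = true class, cols = predicted class.
cm = np.array([[450, 150],    # 600 true "0 - No Pneumonia"
               [192, 408]])   # 600 true "1 - Pneumonia"

precision = cm.diagonal() / cm.sum(axis=0)        # per predicted class (column sums)
recall = cm.diagonal() / cm.sum(axis=1)           # per true class (row sums)
f1 = 2 * precision * recall / (precision + recall)
```

Here `precision` rounds to [0.70, 0.73] and `recall` to [0.75, 0.68], matching the report for model 1.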
loss_2, accuracy_2 = model_2.evaluate(X_test,y_test)
38/38 [==============================] - 2s 47ms/step - loss: 0.7090 - accuracy: 0.6817
y_pred_2 = model_2.predict(X_test)
38/38 [==============================] - 2s 47ms/step
Y_pred_2 = np.argmax(y_pred_2,axis=1)
RESULTS = RESULTS.append(pd.Series(["model_2",np.round((loss_2*100),2),np.round((accuracy_2*100),2)]),ignore_index=True)
cm = confusion_matrix(y_test_arg, Y_pred_2)
labels = ["0 - No Pneumonia", "1 - Pneumonia"]
disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=labels)
disp.plot()
plt.show()
print(classification_report(y_test_arg, Y_pred_2))
precision recall f1-score support
0 0.68 0.69 0.68 600
1 0.68 0.68 0.68 600
accuracy 0.68 1200
macro avg 0.68 0.68 0.68 1200
weighted avg 0.68 0.68 0.68 1200
loss_2a, accuracy_2a = model_2a.evaluate(X_test,y_test)
38/38 [==============================] - 8s 198ms/step - loss: 0.7499 - accuracy: 0.6775
y_pred_2a = model_2a.predict(X_test)  # fixed: the original cell called model.predict, so the metrics below repeat model's
38/38 [==============================] - 2s 47ms/step
Y_pred_2a = np.argmax(y_pred_2a,axis=1)
RESULTS = RESULTS.append(pd.Series(["model_2a",np.round((loss_2a*100),2),np.round((accuracy_2a*100),2)]),ignore_index=True)
cm = confusion_matrix(y_test_arg, Y_pred_2a)
labels = ["0 - No Pneumonia", "1 - Pneumonia"]
disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=labels)
disp.plot()
plt.show()
print(classification_report(y_test_arg, Y_pred_2a))
precision recall f1-score support
0 0.70 0.75 0.72 600
1 0.73 0.68 0.70 600
accuracy 0.71 1200
macro avg 0.71 0.71 0.71 1200
weighted avg 0.71 0.71 0.71 1200
loss_3, accuracy_3 = model_3.evaluate(X_test,y_test)
38/38 [==============================] - 7s 176ms/step - loss: 0.5918 - accuracy: 0.6958
y_pred_3 = model_3.predict(X_test)  # fixed: the original cell called model.predict, so the metrics below repeat model's
38/38 [==============================] - 2s 50ms/step
Y_pred_3 = np.argmax(y_pred_3,axis=1)
RESULTS = RESULTS.append(pd.Series(["model_3",np.round((loss_3*100),2),np.round((accuracy_3*100),2)]),ignore_index=True)
cm = confusion_matrix(y_test_arg, Y_pred_3)
labels = ["0 - No Pneumonia", "1 - Pneumonia"]
disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=labels)
disp.plot()
plt.show()
print(classification_report(y_test_arg, Y_pred_3))
precision recall f1-score support
0 0.70 0.75 0.72 600
1 0.73 0.68 0.70 600
accuracy 0.71 1200
macro avg 0.71 0.71 0.71 1200
weighted avg 0.71 0.71 0.71 1200
RESULTS.columns = ['MODEL_NAME','LOSS','ACCURACY']
RESULTS.style.set_properties(**{'border': '1.3px solid black','color': 'blue'}).hide_index()
| MODEL_NAME | LOSS | ACCURACY |
|---|---|---|
| model | 67.160000 | 71.250000 |
| model_2 | 70.900000 | 68.170000 |
| model_2a | 74.990000 | 67.750000 |
| model_3 | 59.180000 | 69.580000 |
RESULTS.to_csv("base_model_results.csv", index = False)
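A note on the results table: `DataFrame.append` as used above was deprecated in pandas 1.4 and removed in 2.0. Collecting rows in a plain list and building the frame once is the usual replacement; a sketch using the loss/accuracy figures recorded in the evaluate() outputs above:

```python
import pandas as pd

# (model name, loss, accuracy) tuples taken from the evaluate() outputs above
scores = [("model",    0.6716, 0.7125),
          ("model_2",  0.7090, 0.6817),
          ("model_2a", 0.7499, 0.6775),
          ("model_3",  0.5918, 0.6958)]

rows = [{"MODEL_NAME": name,
         "LOSS": round(loss * 100, 2),
         "ACCURACY": round(acc * 100, 2)} for name, loss, acc in scores]
RESULTS = pd.DataFrame(rows)  # build once instead of appending row by row
```

Building the frame in one call also avoids the quadratic copying that repeated `append` incurs.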
* Of all the models, model_3 has the lowest loss (59.18) while keeping accuracy close to the best (69.58% vs 71.25% for model).
print("X_test.shape",X_test.shape)
print("y_test.shape",y_test.shape)
X_test.shape (1200, 224, 224, 3)
y_test.shape (1200, 2)
import random
for i in range(1,6):
rand_test = random.randint(0, 1199)  # X_test has 1200 samples, so valid indices are 0-1199
print("random test sample...:",i)
print("")
print(" random test number",rand_test)
print("")
# print(" X_test",X_test[rand_test])
print(" y_test",y_test[rand_test])
print(" y_pred_1",y_pred_1[rand_test])
print(" argmax",np.argmax(y_pred_1[rand_test]))
print(" y_pred_2",y_pred_2[rand_test])
print(" argmax",np.argmax(y_pred_2[rand_test]))
print(" y_pred_2a",y_pred_2a[rand_test])
print(" argmax",np.argmax(y_pred_2a[rand_test]))
print(" y_pred_3",y_pred_3[rand_test])
print(" argmax",np.argmax(y_pred_3[rand_test]))
print("")
random test sample...: 1
random test number 323
y_test [1. 0.]
y_pred_1 [0.9543633 0.01797388]
argmax 0
y_pred_2 [0.9403397 0.01519391]
argmax 0
y_pred_2a [0.9543633 0.01797388]
argmax 0
y_pred_3 [0.9543633 0.01797388]
argmax 0
random test sample...: 2
random test number 764
y_test [0. 1.]
y_pred_1 [0.91296834 0.00319313]
argmax 0
y_pred_2 [0.5173416 0.21540992]
argmax 0
y_pred_2a [0.91296834 0.00319313]
argmax 0
y_pred_3 [0.91296834 0.00319313]
argmax 0
random test sample...: 3
random test number 894
y_test [1. 0.]
y_pred_1 [0.9359591 0.00203102]
argmax 0
y_pred_2 [0.39366844 0.41483364]
argmax 1
y_pred_2a [0.9359591 0.00203102]
argmax 0
y_pred_3 [0.9359591 0.00203102]
argmax 0
random test sample...: 4
random test number 227
y_test [1. 0.]
y_pred_1 [0.904644 0.06956016]
argmax 0
y_pred_2 [0.7016851 0.21505149]
argmax 0
y_pred_2a [0.904644 0.06956016]
argmax 0
y_pred_3 [0.904644 0.06956016]
argmax 0
random test sample...: 5
random test number 272
y_test [0. 1.]
y_pred_1 [0.11444376 0.91253275]
argmax 1
y_pred_2 [0.7973255 0.13347106]
argmax 0
y_pred_2a [0.11444376 0.91253275]
argmax 1
y_pred_3 [0.11444376 0.91253275]
argmax 1
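Note that `random.randint(0, 1200)` is inclusive at both ends, so it could return 1200, one past the last valid index of the 1200-sample test set, and repeated draws can collide. Sampling the indices with NumPy avoids both issues; a sketch (the seed value is an arbitrary choice for reproducibility):

```python
import numpy as np

n_test = 1200                        # size of X_test, from the shapes above
rng = np.random.default_rng(seed=0)  # fixed seed so the spot-check is repeatable
sample_idx = rng.choice(n_test, size=5, replace=False)  # 5 distinct indices in 0..1199
```

Each index in `sample_idx` can then be used exactly as `rand_test` is in the loop above.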
*** END OF INTERIM SUBMISSION ***
* Here we increase the number of images to 10000.
* By increasing the number of images, more variation and more information is made available,
so the models can learn more from the data.
- In our earlier steps, before training the models, we segregated the images into 2 samples:
* sample1_a - Images without pneumonia - 20672 records
* sample2_a - Images with pneumonia - 6012 records
- From the above samples, we select 5000 records from each sample and concatenate them, which gives a final dataset of
10000 image records. This increases the data by about 67% (from 6000 to 10000 images).
print("No of image with No Pneumonia",len(Images_df[Images_df["Image_Target"] == 0]))
print("\nNo of Images with Pneumonia",len(Images_df[Images_df["Image_Target"] == 1]))
No of image with No Pneumonia 20672
No of Images with Pneumonia 6012
sample1_5000 = Images_sample1_a.sample(n = 5000)
sample2_5000 = Images_sample2_a.sample(n = 5000)
sams_5000 = [sample1_5000, sample2_5000]
result_5000 = pd.concat(sams_5000)
print('Shape of the Increased Training Data: ',result_5000.shape)
print('\nNo of Images in Target 0 and Target 1:\n',result_5000["Image_Target"].value_counts())
Shape of the Increased Training Data:  (10000, 6)
No of Images in Target 0 and Target 1:
0    5000
1    5000
Name: Image_Target, dtype: int64
Images_10000_Target = result_5000.copy()
Images_10000_Target.dtypes
Folder                  object
Image_Class             object
Image_file              object
Full_filename           object
Image_Class_Category     int32
Image_Target             int64
dtype: object
img_rows=224
img_cols=224
dim = (img_rows, img_cols)
X_target = []
brk = 0
i = 1 # initialisation
for img in tqdm(Images_10000_Target["Full_filename"].values):
ds_3 = dicom.dcmread(img)
img_3 = ds_3.pixel_array
rgb = apply_color_lut(img_3, palette='PET')
train_img = rgb
try:
train_img_resize = cv2.resize(train_img, dim, interpolation=cv2.INTER_LINEAR)
except Exception:
brk += 1
print("breaking out for", img)  # print the filename, not the pixel array
break
height_2, width_2, layers = train_img_resize.shape
size=(width_2,height_2)
X_target.append(train_img_resize)
i += 1
100%|████████████████████████████████████████████████████████████████████████████| 10000/10000 [04:35<00:00, 36.24it/s]
np.array(X_target).shape
(10000, 224, 224, 3)
* Pickle the 10000 sample images and their corresponding targets.
fileName = "Images_X_3_target_10000.pkl"
fileObject = open(fileName, 'wb')
pkl.dump(X_target, fileObject)
fileObject.close()
Images_10000_Target.to_pickle("Images_10000_Target.pkl") # pickling the proportionate target from 10000 Images dataframe
with open("Images_X_3_target_10000.pkl", "rb") as image_X_3:
X = pkl.load(image_X_3)
with open("Images_10000_Target.pkl", "rb") as Images_10000_pkl:
Images_10000_Target = pkl.load(Images_10000_pkl)
X[0:2]
[array([[[255, 245, 237],
[255, 238, 222],
[255, 230, 205],
...,
[255, 246, 239],
[255, 249, 245],
[255, 253, 252]],
[[255, 219, 184],
[255, 212, 171],
[255, 200, 145],
...,
[255, 221, 188],
[255, 226, 198],
[255, 242, 231]],
[[255, 194, 134],
[255, 188, 122],
[255, 174, 94],
...,
[255, 195, 134],
[255, 201, 147],
[255, 218, 182]],
...,
[[ 0, 40, 39],
[ 0, 40, 39],
[ 0, 40, 39],
...,
[ 0, 38, 37],
[ 0, 38, 37],
[ 0, 40, 39]],
[[ 0, 40, 39],
[ 0, 39, 38],
[ 0, 40, 39],
...,
[ 0, 38, 37],
[ 0, 38, 37],
[ 0, 40, 39]],
[[ 0, 40, 39],
[ 0, 39, 38],
[ 0, 40, 39],
...,
[ 0, 38, 37],
[ 0, 38, 37],
[ 0, 40, 39]]], dtype=uint8),
array([[[ 0, 48, 47],
[ 0, 47, 46],
[ 0, 44, 43],
...,
[ 0, 36, 35],
[ 0, 38, 37],
[ 0, 34, 33]],
[[ 0, 47, 45],
[ 0, 44, 43],
[ 0, 44, 43],
...,
[ 0, 33, 32],
[ 0, 33, 32],
[ 0, 33, 32]],
[[ 0, 44, 43],
[ 0, 43, 42],
[ 0, 40, 39],
...,
[ 0, 36, 35],
[ 0, 36, 35],
[ 0, 36, 35]],
...,
[[228, 101, 53],
[233, 106, 44],
[239, 112, 32],
...,
[ 0, 43, 43],
[ 0, 46, 45],
[ 0, 39, 38]],
[[233, 106, 44],
[231, 104, 47],
[240, 113, 30],
...,
[ 0, 40, 39],
[ 0, 48, 47],
[ 0, 40, 40]],
[[235, 108, 40],
[242, 115, 26],
[245, 119, 20],
...,
[ 0, 44, 43],
[ 0, 50, 49],
[ 0, 38, 37]]], dtype=uint8)]
X_target = X
np.array(X_target).shape
(10000, 224, 224, 3)
X_tar_10k = np.array(X_target)
X_tar_10k.shape
(10000, 224, 224, 3)
print(X_tar_10k.max())
print('\n',X_tar_10k.min())
255

0
y1_tar_10k = Images_10000_Target['Image_Target']
y1_tar_10k.value_counts()
0    5000
1    5000
Name: Image_Target, dtype: int64
y1_tar_cat_10k = to_categorical(y1_tar_10k, num_classes=2)
print("Shape of y1_tar_cat_10k:", y1_tar_cat_10k.shape)
Shape of y1_tar_cat_10k: (10000, 2)
y1_tar_cat_10k[0:10]
array([[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.]], dtype=float32)
print("X_tar_10k shape",X_tar_10k.shape)
print("y1_tar_cat_10k shape",y1_tar_cat_10k.shape)
X_tar_10k shape (10000, 224, 224, 3)
y1_tar_cat_10k shape (10000, 2)
* Split the data into train, test, and validation sets:
X_train, X_test, y_train, y_test = train_test_split(X_tar_10k, y1_tar_cat_10k, test_size=.20,
stratify=y1_tar_cat_10k, random_state=1) # 80% Training and 20% Testing
print("Shape of X_train",X_train.shape)
print("\nShape of y_train",y_train.shape)
print("\nShape of X_test",X_test.shape)
print("\nShape of y_test",y_test.shape)
Shape of X_train (8000, 224, 224, 3)
Shape of y_train (8000, 2)
Shape of X_test (2000, 224, 224, 3)
Shape of y_test (2000, 2)
X_train1, X_val, y_train1, y_val = train_test_split(X_train, y_train, test_size=.20, stratify=y_train, random_state=1) # 80% training and 20% validation
print("Shape of X_train1",X_train1.shape)
print("\nShape of y_train1",y_train1.shape)
print("\nShape of X_val",X_val.shape)
print("\nShape of y_val",y_val.shape)
Shape of X_train1 (6400, 224, 224, 3)
Shape of y_train1 (6400, 2)
Shape of X_val (1600, 224, 224, 3)
Shape of y_val (1600, 2)
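The two-stage 80/20 split partitions the 10000 images into 64% training, 16% validation, and 20% test overall; a quick arithmetic check that reproduces the shapes printed above:

```python
n = 10000
n_test = int(n * 0.20)           # first split:  2000 test images
n_train = n - n_test             #               8000 remain
n_val = int(n_train * 0.20)      # second split: 1600 validation images
n_train1 = n_train - n_val       #               6400 final training images
```

These match the (6400, ...), (1600, ...), and (2000, ...) shapes reported by the two `train_test_split` calls.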
print(X_val[0])
print(y_val[0])
[[[145 20 217] [ 75 53 201] [ 8 119 133] ... [ 0 0 0] [ 0 0 0] [ 0 0 0]] [[212 85 86] [129 10 238] [ 47 80 173] ... [ 0 0 0] [ 0 0 0] [ 0 0 0]] [[237 110 35] [158 30 196] [106 20 233] ... [ 0 0 0] [ 0 0 0] [ 0 0 0]] ... [[171 43 170] [133 6 244] [ 90 37 217] ... [ 0 78 76] [ 0 81 79] [ 0 79 77]] [[255 154 51] [250 127 20] [226 99 58] ... [ 37 90 163] [ 38 88 164] [ 41 85 167]] [[255 219 184] [255 208 162] [255 207 160] ... [170 42 172] [172 44 167] [174 46 163]]] [1. 0.]
tf.keras.backend.clear_session()
batch_size = 64
nb_epochs = 10
history = model.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
100/100 [==============================] - 53s 525ms/step - loss: 0.5658 - accuracy: 0.7228 - val_loss: 0.8308 - val_accuracy: 0.6550
Epoch 2/10
100/100 [==============================] - 53s 530ms/step - loss: 0.5360 - accuracy: 0.7403 - val_loss: 0.5387 - val_accuracy: 0.7281
Epoch 3/10
100/100 [==============================] - 53s 531ms/step - loss: 0.5084 - accuracy: 0.7550 - val_loss: 0.5603 - val_accuracy: 0.7206
Epoch 4/10
100/100 [==============================] - 53s 527ms/step - loss: 0.4901 - accuracy: 0.7652 - val_loss: 0.5544 - val_accuracy: 0.7275
plt.figure(figsize=(6,4))
plt.plot(history.history['loss'], label='train')
plt.plot(history.history['val_loss'], label='test')
plt.legend()
plt.show()
plt.figure(figsize=(6,4))
plt.plot(history.history['accuracy'], label='train')
plt.plot(history.history['val_accuracy'], label='test')
plt.legend()
plt.show()
tf.keras.backend.clear_session()
batch_size = 64
nb_epochs = 10
history_2 = model_2.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=3, min_delta=0.0001)])
Epoch 1/10
100/100 [==============================] - 53s 533ms/step - loss: 0.5336 - accuracy: 0.7456 - val_loss: 0.5606 - val_accuracy: 0.7287
Epoch 2/10
100/100 [==============================] - 53s 533ms/step - loss: 0.4929 - accuracy: 0.7686 - val_loss: 0.5542 - val_accuracy: 0.7244
Epoch 3/10
100/100 [==============================] - 53s 532ms/step - loss: 0.4625 - accuracy: 0.7853 - val_loss: 0.5424 - val_accuracy: 0.7431
Epoch 4/10
100/100 [==============================] - 54s 541ms/step - loss: 0.4185 - accuracy: 0.8103 - val_loss: 0.5951 - val_accuracy: 0.7069
Epoch 5/10
100/100 [==============================] - 54s 536ms/step - loss: 0.3710 - accuracy: 0.8384 - val_loss: 0.5894 - val_accuracy: 0.7431
Epoch 6/10
100/100 [==============================] - 54s 542ms/step - loss: 0.3137 - accuracy: 0.8622 - val_loss: 0.7019 - val_accuracy: 0.6675
plt.figure(figsize=(6,4))
plt.plot(history_2.history['loss'], label='train')
plt.plot(history_2.history['val_loss'], label='test')
plt.legend()
plt.show()
plt.figure(figsize=(6,4))
plt.plot(history_2.history['accuracy'], label='train')
plt.plot(history_2.history['val_accuracy'], label='test')
plt.legend()
plt.show()
tf.keras.backend.clear_session()
batch_size = 64
nb_epochs = 10
history_2a = model_2a.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=3, min_delta=0.01)])
Epoch 1/10
100/100 [==============================] - 281s 3s/step - loss: 0.5493 - accuracy: 0.7348 - val_loss: 0.5557 - val_accuracy: 0.7294
Epoch 2/10
100/100 [==============================] - 326s 3s/step - loss: 0.4865 - accuracy: 0.7680 - val_loss: 0.5466 - val_accuracy: 0.7613
Epoch 3/10
100/100 [==============================] - 323s 3s/step - loss: 0.4405 - accuracy: 0.7980 - val_loss: 0.5561 - val_accuracy: 0.7287
Epoch 4/10
100/100 [==============================] - 325s 3s/step - loss: 0.3928 - accuracy: 0.8253 - val_loss: 0.5607 - val_accuracy: 0.7450
plt.figure(figsize=(6,4))
plt.plot(history_2a.history['loss'], label='train')
plt.plot(history_2a.history['val_loss'], label='test')
plt.legend()
plt.show()
plt.figure(figsize=(6,4))
plt.plot(history_2a.history['accuracy'], label='train')
plt.plot(history_2a.history['val_accuracy'], label='test')
plt.legend()
plt.show()
tf.keras.backend.clear_session()
batch_size = 64
nb_epochs = 10
history_3 = model_3.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=3, min_delta=0.001)])
Epoch 1/10
100/100 [==============================] - 301s 3s/step - loss: 0.5468 - accuracy: 0.7325 - val_loss: 0.5153 - val_accuracy: 0.7525
Epoch 2/10
100/100 [==============================] - 299s 3s/step - loss: 0.5152 - accuracy: 0.7533 - val_loss: 0.5256 - val_accuracy: 0.7362
Epoch 3/10
100/100 [==============================] - 298s 3s/step - loss: 0.5022 - accuracy: 0.7570 - val_loss: 0.5226 - val_accuracy: 0.7350
Epoch 4/10
100/100 [==============================] - 297s 3s/step - loss: 0.4877 - accuracy: 0.7659 - val_loss: 0.5242 - val_accuracy: 0.7381
plt.figure(figsize=(6,4))
plt.plot(history_3.history['loss'], label='train')
plt.plot(history_3.history['val_loss'], label='test')
plt.legend()
plt.show()
plt.figure(figsize=(6,4))
plt.plot(history_3.history['accuracy'], label='train')
plt.plot(history_3.history['val_accuracy'], label='test')
plt.legend()
plt.show()
loss_1, accuracy_1 = model.evaluate(X_test,y_test)
63/63 [==============================] - 5s 79ms/step - loss: 0.5370 - accuracy: 0.7410
y_pred_1 = model.predict(X_test)
63/63 [==============================] - 5s 77ms/step
y_test_arg=np.argmax(y_test,axis=1)
Y_pred_1 = np.argmax(y_pred_1,axis=1)
RESULTS = pd.DataFrame()
RESULTS = RESULTS.append(pd.Series(["model",np.round((loss_1*100),2),np.round((accuracy_1*100),2)]),ignore_index=True)
cm = confusion_matrix(y_test_arg, Y_pred_1)
labels = ["0 - No Pneumonia", "1 - Pneumonia"]
disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=labels)
disp.plot()
plt.show()
loss_2, accuracy_2 = model_2.evaluate(X_test,y_test)
63/63 [==============================] - 5s 83ms/step - loss: 0.7691 - accuracy: 0.6565
y_pred_2 = model_2.predict(X_test)
63/63 [==============================] - 5s 82ms/step
Y_pred_2 = np.argmax(y_pred_2,axis=1)
RESULTS = RESULTS.append(pd.Series(["model_2",np.round((loss_2*100),2),np.round((accuracy_2*100),2)]),ignore_index=True)
cm = confusion_matrix(y_test_arg, Y_pred_2)
labels = ["0 - No Pneumonia", "1 - Pneumonia"]
disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=labels)
disp.plot()
plt.show()
print(classification_report(y_test_arg, Y_pred_2))
precision recall f1-score support
0 0.62 0.82 0.70 1000
1 0.73 0.49 0.59 1000
accuracy 0.66 2000
macro avg 0.68 0.66 0.65 2000
weighted avg 0.68 0.66 0.65 2000
loss_2a, accuracy_2a = model_2a.evaluate(X_test,y_test)
63/63 [==============================] - 19s 304ms/step - loss: 0.5580 - accuracy: 0.7450
y_pred_2a = model_2a.predict(X_test)  # fixed: the original cell called model.predict, so the metrics below repeat model's
63/63 [==============================] - 5s 76ms/step
Y_pred_2a = np.argmax(y_pred_2a,axis=1)
RESULTS = RESULTS.append(pd.Series(["model_2a",np.round((loss_2a*100),2),np.round((accuracy_2a*100),2)]),ignore_index=True)
cm = confusion_matrix(y_test_arg, Y_pred_2a)
labels = ["0 - No Pneumonia", "1 - Pneumonia"]
disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=labels)
disp.plot()
plt.show()
print(classification_report(y_test_arg, Y_pred_2a))
precision recall f1-score support
0 0.71 0.81 0.76 1000
1 0.78 0.67 0.72 1000
accuracy 0.74 2000
macro avg 0.75 0.74 0.74 2000
weighted avg 0.75 0.74 0.74 2000
loss_3, accuracy_3 = model_3.evaluate(X_test,y_test)
63/63 [==============================] - 18s 277ms/step - loss: 0.5284 - accuracy: 0.7295
y_pred_3 = model_3.predict(X_test)  # fixed: the original cell called model.predict, so the metrics below repeat model's
63/63 [==============================] - 5s 77ms/step
Y_pred_3 = np.argmax(y_pred_3,axis=1)
RESULTS = RESULTS.append(pd.Series(["model_3",np.round((loss_3*100),2),np.round((accuracy_3*100),2)]),ignore_index=True)
cm = confusion_matrix(y_test_arg, Y_pred_3)
labels = ["0 - No Pneumonia", "1 - Pneumonia"]
disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=labels)
disp.plot()
plt.show()
print(classification_report(y_test_arg, Y_pred_3))
precision recall f1-score support
0 0.71 0.81 0.76 1000
1 0.78 0.67 0.72 1000
accuracy 0.74 2000
macro avg 0.75 0.74 0.74 2000
weighted avg 0.75 0.74 0.74 2000
RESULTS.columns = ['MODEL_NAME','LOSS','ACCURACY']
RESULTS.style.set_properties(**{'border': '1.3px solid black','color': 'blue'}).hide_index()
| MODEL_NAME | LOSS | ACCURACY |
|---|---|---|
| model | 53.700000 | 74.100000 |
| model_2 | 76.910000 | 65.650000 |
| model_2a | 55.800000 | 74.500000 |
| model_3 | 52.840000 | 72.950000 |
RESULTS.to_csv("FINE_TUNING_10000_IMAGES.csv",index = False)
Observations:
1. On increasing the images to 10000, model_2a gained roughly 10% relative accuracy (67.75% to 74.50%) against the 67% of additional images added.
2. Comparing all the models, there is not much improvement in the accuracies of the other models.
3. To keep memory allocation in check while training, from here on we use only 6000 images so that
the models train faster during further fine-tuning.
# Convert the dicom images to jpeg format.
dst = "Training_jpg"
if os.path.exists(dst):
print("Path already exists")
else:
os.mkdir(dst)
jpeg_path = os.path.join(dst+"/")
folder_path = "stage_2_train_images"
dicom_path = os.listdir("stage_2_train_images")
NAMES_DCM = []
NAMES_JPG = []
for n, image in enumerate(dicom_path):
ds = dicom.dcmread(os.path.join(folder_path, image))
NAMES_DCM.append(image)
new_image = ds.pixel_array.astype(float)
scaled_image = (np.maximum(new_image, 0) / new_image.max()) * 255.0
scaled_image = np.uint8(scaled_image)
final_image = Image.fromarray(scaled_image)
final_image.save(jpeg_path + image.replace('.dcm','.jpg'))
NAMES_JPG.append(image.replace('.dcm','.jpg'))
Path already exists
# Convert the dicom images to jpeg format.
dst = "Training_jpg"
if os.path.exists(dst):
print("Path already exists")
else:
os.mkdir(dst)
jpeg_path = os.path.join(dst+"/")
folder_path = "stage_2_train_images"
dicom_path = os.listdir("stage_2_train_images")
NAMES_DCM = []
NAMES_JPG = []
for image in tqdm(Images_6000_Target["Image_file"].values):
ds = dicom.dcmread(os.path.join(folder_path, image))
NAMES_DCM.append(image)
new_image = ds.pixel_array.astype(float)
scaled_image = (np.maximum(new_image, 0) / new_image.max()) * 255.0
scaled_image = np.uint8(scaled_image)
final_image = Image.fromarray(scaled_image)
final_image.save(jpeg_path + image.replace('.dcm','.jpg'))
NAMES_JPG.append(image.replace('.dcm','.jpg'))
Path already exists
100%|██████████████████████████████████████████████████████████████████████████████| 6000/6000 [03:20<00:00, 30.00it/s]
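Both conversion loops above scale each raw pixel array by its own maximum before casting to `uint8`; if an image were entirely zero, that division would fail. A guarded version of the scaling step (the helper name `dicom_to_uint8` is ours):

```python
import numpy as np

def dicom_to_uint8(pixels):
    """Clamp negatives, then scale a raw DICOM pixel array to the 0-255 uint8
    range, returning an all-black frame instead of dividing by zero on blank input."""
    pixels = np.maximum(pixels.astype(float), 0)
    peak = pixels.max()
    if peak == 0:  # blank image: avoid divide-by-zero
        return np.zeros(pixels.shape, dtype=np.uint8)
    return np.uint8(pixels / peak * 255.0)
```

This reproduces `(np.maximum(new_image, 0) / new_image.max()) * 255.0` from the loops, with the degenerate case handled.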
NAMES_JPG = pd.Series(NAMES_JPG)
NAMES_DCM = pd.Series(NAMES_DCM)
NAMES_DF = {"NAMES_JPG" : NAMES_JPG, "NAMES_DCM" : NAMES_DCM}
NAMES_DF = pd.DataFrame(NAMES_DF)
NAMES_DF_FINAL = pd.merge(NAMES_DF, Images_df, left_on='NAMES_DCM', right_on='Image_file')
NAMES_DF_FINAL = NAMES_DF_FINAL.drop(columns = ["Folder","Image_file","Full_filename"])
NAMES_DF_FINAL["Full_filename"] = "Training_jpg"+"/"+NAMES_DF_FINAL["NAMES_JPG"]
NAMES_DF_FINAL.sample(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'}).hide_index()
| NAMES_JPG | NAMES_DCM | Image_Class | Image_Class_Category | Image_Target | Full_filename |
|---|---|---|---|---|---|
| bc03198b-57f4-47b5-8808-114fd5cb9808.jpg | bc03198b-57f4-47b5-8808-114fd5cb9808.dcm | Lung Opacity | 0 | 1 | Training_jpg/bc03198b-57f4-47b5-8808-114fd5cb9808.jpg |
| db5c4b38-0c83-4f5b-a159-ea4c3e1bc7a8.jpg | db5c4b38-0c83-4f5b-a159-ea4c3e1bc7a8.dcm | Not Normal | 2 | 0 | Training_jpg/db5c4b38-0c83-4f5b-a159-ea4c3e1bc7a8.jpg |
| e94d8646-1703-44db-9062-c9b305e238f0.jpg | e94d8646-1703-44db-9062-c9b305e238f0.dcm | Lung Opacity | 0 | 1 | Training_jpg/e94d8646-1703-44db-9062-c9b305e238f0.jpg |
| e63932eb-033b-46f1-aa55-6e913ccbe03a.jpg | e63932eb-033b-46f1-aa55-6e913ccbe03a.dcm | Not Normal | 2 | 0 | Training_jpg/e63932eb-033b-46f1-aa55-6e913ccbe03a.jpg |
| 7adb12c0-b1b2-402e-9c58-5a8873eef61d.jpg | 7adb12c0-b1b2-402e-9c58-5a8873eef61d.dcm | Lung Opacity | 0 | 1 | Training_jpg/7adb12c0-b1b2-402e-9c58-5a8873eef61d.jpg |
| 3e881f10-28aa-4626-a79a-50cc014b7a1a.jpg | 3e881f10-28aa-4626-a79a-50cc014b7a1a.dcm | Lung Opacity | 0 | 1 | Training_jpg/3e881f10-28aa-4626-a79a-50cc014b7a1a.jpg |
| 8f719c99-d7f6-4e27-a5f6-3a259dc60b61.jpg | 8f719c99-d7f6-4e27-a5f6-3a259dc60b61.dcm | Normal | 1 | 0 | Training_jpg/8f719c99-d7f6-4e27-a5f6-3a259dc60b61.jpg |
| a99561a5-2c12-48cd-be57-ce6b6cf58da1.jpg | a99561a5-2c12-48cd-be57-ce6b6cf58da1.dcm | Lung Opacity | 0 | 1 | Training_jpg/a99561a5-2c12-48cd-be57-ce6b6cf58da1.jpg |
| 01fe92f7-ff87-4f9e-9077-e00e670d1b47.jpg | 01fe92f7-ff87-4f9e-9077-e00e670d1b47.dcm | Normal | 1 | 0 | Training_jpg/01fe92f7-ff87-4f9e-9077-e00e670d1b47.jpg |
| 091d75a8-e23e-4cb8-af67-10cb314077ec.jpg | 091d75a8-e23e-4cb8-af67-10cb314077ec.dcm | Lung Opacity | 0 | 1 | Training_jpg/091d75a8-e23e-4cb8-af67-10cb314077ec.jpg |
| d60394b5-1f55-4929-a054-abd1eaa5d088.jpg | d60394b5-1f55-4929-a054-abd1eaa5d088.dcm | Normal | 1 | 0 | Training_jpg/d60394b5-1f55-4929-a054-abd1eaa5d088.jpg |
| a512e598-45c5-4f16-b0b0-63f05f6afb14.jpg | a512e598-45c5-4f16-b0b0-63f05f6afb14.dcm | Not Normal | 2 | 0 | Training_jpg/a512e598-45c5-4f16-b0b0-63f05f6afb14.jpg |
| c1c3ec5d-20ba-42f7-91f9-48032d97ffc9.jpg | c1c3ec5d-20ba-42f7-91f9-48032d97ffc9.dcm | Normal | 1 | 0 | Training_jpg/c1c3ec5d-20ba-42f7-91f9-48032d97ffc9.jpg |
| 34f16115-b927-44eb-9880-b413b65421fa.jpg | 34f16115-b927-44eb-9880-b413b65421fa.dcm | Lung Opacity | 0 | 1 | Training_jpg/34f16115-b927-44eb-9880-b413b65421fa.jpg |
| 20fa542a-4930-46eb-8e1b-3e938980e6ea.jpg | 20fa542a-4930-46eb-8e1b-3e938980e6ea.dcm | Lung Opacity | 0 | 1 | Training_jpg/20fa542a-4930-46eb-8e1b-3e938980e6ea.jpg |
| e660384d-3cd7-4a42-bfa3-c64552f55e33.jpg | e660384d-3cd7-4a42-bfa3-c64552f55e33.dcm | Not Normal | 2 | 0 | Training_jpg/e660384d-3cd7-4a42-bfa3-c64552f55e33.jpg |
| 3abb7176-035d-46cc-844e-820870e8154b.jpg | 3abb7176-035d-46cc-844e-820870e8154b.dcm | Lung Opacity | 0 | 1 | Training_jpg/3abb7176-035d-46cc-844e-820870e8154b.jpg |
| 05b2d676-777a-4a4d-82ab-bf3f5f9b4d60.jpg | 05b2d676-777a-4a4d-82ab-bf3f5f9b4d60.dcm | Lung Opacity | 0 | 1 | Training_jpg/05b2d676-777a-4a4d-82ab-bf3f5f9b4d60.jpg |
| 8176c9ad-ffde-4b9c-9050-b5b725ed4c0d.jpg | 8176c9ad-ffde-4b9c-9050-b5b725ed4c0d.dcm | Normal | 1 | 0 | Training_jpg/8176c9ad-ffde-4b9c-9050-b5b725ed4c0d.jpg |
| 8c98c399-dc7d-4466-a9b4-fe017b51016e.jpg | 8c98c399-dc7d-4466-a9b4-fe017b51016e.dcm | Normal | 1 | 0 | Training_jpg/8c98c399-dc7d-4466-a9b4-fe017b51016e.jpg |
NAMES_DF_FINAL.to_pickle("NAMES_DF_FINAL.pkl") # pickling the jpg image data
with open("NAMES_DF_FINAL.pkl", "rb") as load_data:
NAMES_DF_FINAL = pickle.load(load_data)
def show_jpg_images(passed_value):
show_df = NAMES_DF_FINAL.sample(n=passed_value)
z = math.ceil(passed_value/5) # rows needed for a 5-column grid
fig, ax = plt.subplots(z,5,constrained_layout=True)
fig.set_figheight(20)
fig.set_figwidth(20)
i = 0
j = 0
for index, row in show_df.iterrows():
image_path = row['Full_filename']
jpg_image = cv2.imread(image_path)
jpg_image = cv2.resize(jpg_image, dsize=(224,224))
ax[i][j].imshow(jpg_image)
ax[i][j].set_title(row['Image_Class'])
j = j+1
if j == 5:
i = i+1
j = 0
plt.show()
return(show_df)
show_jpg_images(20)
| NAMES_JPG | NAMES_DCM | Image_Class | Image_Class_Category | Image_Target | Full_filename | |
|---|---|---|---|---|---|---|
| 5075 | b8328cd5-8b9d-4834-aa79-053be9640655.jpg | b8328cd5-8b9d-4834-aa79-053be9640655.dcm | Lung Opacity | 0 | 1 | Training_jpg/b8328cd5-8b9d-4834-aa79-053be9640655.jpg |
| 1960 | 468feb51-e690-4ba0-967d-62dec940b671.jpg | 468feb51-e690-4ba0-967d-62dec940b671.dcm | Not Normal | 2 | 0 | Training_jpg/468feb51-e690-4ba0-967d-62dec940b671.jpg |
| 2978 | 9a7bcfc0-6ccf-4d8f-9b02-ad56448761c5.jpg | 9a7bcfc0-6ccf-4d8f-9b02-ad56448761c5.dcm | Not Normal | 2 | 0 | Training_jpg/9a7bcfc0-6ccf-4d8f-9b02-ad56448761c5.jpg |
| 1726 | c4542463-c7e3-4af7-a6d2-bb6cf6388012.jpg | c4542463-c7e3-4af7-a6d2-bb6cf6388012.dcm | Not Normal | 2 | 0 | Training_jpg/c4542463-c7e3-4af7-a6d2-bb6cf6388012.jpg |
| 5407 | 57bffc6f-1656-45a2-9e49-e28d1dcbeee1.jpg | 57bffc6f-1656-45a2-9e49-e28d1dcbeee1.dcm | Lung Opacity | 0 | 1 | Training_jpg/57bffc6f-1656-45a2-9e49-e28d1dcbeee1.jpg |
| 5272 | d5e9a92d-fb2f-416b-8731-4f7435d6c9c5.jpg | d5e9a92d-fb2f-416b-8731-4f7435d6c9c5.dcm | Lung Opacity | 0 | 1 | Training_jpg/d5e9a92d-fb2f-416b-8731-4f7435d6c9c5.jpg |
| 724 | 9aeac4e0-68d4-4214-9e22-72facf654789.jpg | 9aeac4e0-68d4-4214-9e22-72facf654789.dcm | Not Normal | 2 | 0 | Training_jpg/9aeac4e0-68d4-4214-9e22-72facf654789.jpg |
| 5247 | df21b6dd-32b7-4edd-8b7b-fa1a35e0c043.jpg | df21b6dd-32b7-4edd-8b7b-fa1a35e0c043.dcm | Lung Opacity | 0 | 1 | Training_jpg/df21b6dd-32b7-4edd-8b7b-fa1a35e0c043.jpg |
| 4998 | b7136f3f-41da-4b16-b7c3-4b929d81e6a5.jpg | b7136f3f-41da-4b16-b7c3-4b929d81e6a5.dcm | Lung Opacity | 0 | 1 | Training_jpg/b7136f3f-41da-4b16-b7c3-4b929d81e6a5.jpg |
| 4789 | 348699f9-4eb2-42b7-b9e8-8bc3230b32a5.jpg | 348699f9-4eb2-42b7-b9e8-8bc3230b32a5.dcm | Lung Opacity | 0 | 1 | Training_jpg/348699f9-4eb2-42b7-b9e8-8bc3230b32a5.jpg |
| 1404 | 89ff3f4d-2631-42d0-8d82-8017c4dcce7c.jpg | 89ff3f4d-2631-42d0-8d82-8017c4dcce7c.dcm | Not Normal | 2 | 0 | Training_jpg/89ff3f4d-2631-42d0-8d82-8017c4dcce7c.jpg |
| 5063 | ac18858f-cb0a-4d54-89a0-25a8c2bc0980.jpg | ac18858f-cb0a-4d54-89a0-25a8c2bc0980.dcm | Lung Opacity | 0 | 1 | Training_jpg/ac18858f-cb0a-4d54-89a0-25a8c2bc0980.jpg |
| 486 | 95ef1bd3-c207-4d70-88fd-c37c275086bb.jpg | 95ef1bd3-c207-4d70-88fd-c37c275086bb.dcm | Normal | 1 | 0 | Training_jpg/95ef1bd3-c207-4d70-88fd-c37c275086bb.jpg |
| 3464 | 0b0073e0-9c19-4da4-88f2-b3d2ebc6f1f4.jpg | 0b0073e0-9c19-4da4-88f2-b3d2ebc6f1f4.dcm | Lung Opacity | 0 | 1 | Training_jpg/0b0073e0-9c19-4da4-88f2-b3d2ebc6f1f4.jpg |
| 3386 | 722ad9c3-919a-4c08-bb67-cafd603ba754.jpg | 722ad9c3-919a-4c08-bb67-cafd603ba754.dcm | Lung Opacity | 0 | 1 | Training_jpg/722ad9c3-919a-4c08-bb67-cafd603ba754.jpg |
| 5640 | 9cde0ff6-095f-4ac6-88a8-c9ffc4676230.jpg | 9cde0ff6-095f-4ac6-88a8-c9ffc4676230.dcm | Lung Opacity | 0 | 1 | Training_jpg/9cde0ff6-095f-4ac6-88a8-c9ffc4676230.jpg |
| 329 | d0654ae8-b032-44b4-93e1-399dfd5dde26.jpg | d0654ae8-b032-44b4-93e1-399dfd5dde26.dcm | Not Normal | 2 | 0 | Training_jpg/d0654ae8-b032-44b4-93e1-399dfd5dde26.jpg |
| 5822 | aec75bea-7a4a-4ca6-8e82-bf3f05963c9d.jpg | aec75bea-7a4a-4ca6-8e82-bf3f05963c9d.dcm | Lung Opacity | 0 | 1 | Training_jpg/aec75bea-7a4a-4ca6-8e82-bf3f05963c9d.jpg |
| 5778 | b0604ee6-5f21-4b14-98ca-3bedaccd4e10.jpg | b0604ee6-5f21-4b14-98ca-3bedaccd4e10.dcm | Lung Opacity | 0 | 1 | Training_jpg/b0604ee6-5f21-4b14-98ca-3bedaccd4e10.jpg |
| 3672 | f9538c72-0f7b-45e3-98a8-de3e4434fe4d.jpg | f9538c72-0f7b-45e3-98a8-de3e4434fe4d.dcm | Lung Opacity | 0 | 1 | Training_jpg/f9538c72-0f7b-45e3-98a8-de3e4434fe4d.jpg |
sample_1_jpg = NAMES_DF_FINAL[NAMES_DF_FINAL["Image_Target"] == 0].sample(n = 3000)
sample_2_jpg = NAMES_DF_FINAL[NAMES_DF_FINAL["Image_Target"] == 1].sample(n = 3000)
samples_jpg = [sample_1_jpg,sample_2_jpg]
result_jpg = pd.concat(samples_jpg)
print("Shape of the data selected for model building: ", result_jpg.shape)
print("\nNo.of rows of data selected for target 0 and 1:\n",result_jpg["Image_Target"].value_counts())
Shape of the data selected for model building:  (6000, 6)
No.of rows of data selected for target 0 and 1:
0    3000
1    3000
Name: Image_Target, dtype: int64
images_6000_jpg = result_jpg.copy()
height = 224
width = 224
dim = (height, width)
X_target = []
brk = 0  # count of images that failed to load
for img in tqdm(images_6000_jpg["Full_filename"].values):
    try:
        # Read each JPEG and resize it to 224x224 (cv2.resize takes (width, height);
        # both are 224 here, so the order does not matter).
        train_img_resize = cv2.resize(cv2.imread(img), dim, interpolation=cv2.INTER_LINEAR)
    except cv2.error:
        brk += 1
        print("skipping unreadable image ", img)
        continue  # do not append a stale image from the previous iteration
    X_target.append(train_img_resize)
100%|█████████████████████████████████████████████████████████████████████████████| 6000/6000 [00:43<00:00, 139.11it/s]
X_target = np.asarray(X_target)
fileName = "Images_jpg_target_6000.pkl"
with open(fileName, 'wb') as fileObject:
    pkl.dump(X_target, fileObject)
images_6000_jpg.to_pickle("images_6000_jpg.pkl") # pickling the proportionate target from 6000 jpg Images dataframe
with open("images_6000_jpg.pkl", "rb") as image_pickle:
    images_6000_jpg = pkl.load(image_pickle)
images_6000_jpg.sample(10).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | NAMES_JPG | NAMES_DCM | Image_Class | Image_Class_Category | Image_Target | Full_filename |
|---|---|---|---|---|---|---|
| 3557 | 6a41e296-8bda-4f84-8766-c253246b77d0.jpg | 6a41e296-8bda-4f84-8766-c253246b77d0.dcm | Lung Opacity | 0 | 1 | Training_jpg/6a41e296-8bda-4f84-8766-c253246b77d0.jpg |
| 3350 | bc234c2b-96aa-4263-931b-b7420c235daa.jpg | bc234c2b-96aa-4263-931b-b7420c235daa.dcm | Lung Opacity | 0 | 1 | Training_jpg/bc234c2b-96aa-4263-931b-b7420c235daa.jpg |
| 5344 | 36cf3250-2ea6-4ef9-8143-c1bbafe83475.jpg | 36cf3250-2ea6-4ef9-8143-c1bbafe83475.dcm | Lung Opacity | 0 | 1 | Training_jpg/36cf3250-2ea6-4ef9-8143-c1bbafe83475.jpg |
| 3575 | c3670fac-87af-4e1b-a1d4-3181427a60fc.jpg | c3670fac-87af-4e1b-a1d4-3181427a60fc.dcm | Lung Opacity | 0 | 1 | Training_jpg/c3670fac-87af-4e1b-a1d4-3181427a60fc.jpg |
| 2021 | fe4d9dea-1e2e-4ef4-886d-bf9a668b7850.jpg | fe4d9dea-1e2e-4ef4-886d-bf9a668b7850.dcm | Not Normal | 2 | 0 | Training_jpg/fe4d9dea-1e2e-4ef4-886d-bf9a668b7850.jpg |
| 64 | d5dec418-0f74-489f-b696-ce73e6016e40.jpg | d5dec418-0f74-489f-b696-ce73e6016e40.dcm | Normal | 1 | 0 | Training_jpg/d5dec418-0f74-489f-b696-ce73e6016e40.jpg |
| 4488 | ca6ed544-2804-4f23-97a6-5a9f4814e7c1.jpg | ca6ed544-2804-4f23-97a6-5a9f4814e7c1.dcm | Lung Opacity | 0 | 1 | Training_jpg/ca6ed544-2804-4f23-97a6-5a9f4814e7c1.jpg |
| 2316 | 6e9329fc-862a-47dc-848f-fb560ed17096.jpg | 6e9329fc-862a-47dc-848f-fb560ed17096.dcm | Not Normal | 2 | 0 | Training_jpg/6e9329fc-862a-47dc-848f-fb560ed17096.jpg |
| 3255 | add89c20-44cf-461a-9857-c41d753d4062.jpg | add89c20-44cf-461a-9857-c41d753d4062.dcm | Lung Opacity | 0 | 1 | Training_jpg/add89c20-44cf-461a-9857-c41d753d4062.jpg |
| 1351 | e5ffd138-39af-4e75-a79f-3531f2ff1b9e.jpg | e5ffd138-39af-4e75-a79f-3531f2ff1b9e.dcm | Normal | 1 | 0 | Training_jpg/e5ffd138-39af-4e75-a79f-3531f2ff1b9e.jpg |
with open("Images_jpg_target_6000.pkl", "rb") as image_jpg:
    X = pkl.load(image_jpg)
X[0:2]
array([[[[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0],
...,
[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0]],
[[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0],
...,
[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0]],
[[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0],
...,
[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0]],
...,
[[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0],
...,
[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0]],
[[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0],
...,
[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0]],
[[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0],
...,
[ 0, 0, 0],
[ 0, 0, 0],
[ 0, 0, 0]]],
[[[218, 218, 218],
[213, 213, 213],
[206, 206, 206],
...,
[ 16, 16, 16],
[ 16, 16, 16],
[ 14, 14, 14]],
[[207, 207, 207],
[203, 203, 203],
[194, 194, 194],
...,
[ 16, 16, 16],
[ 16, 16, 16],
[ 14, 14, 14]],
[[200, 200, 200],
[192, 192, 192],
[183, 183, 183],
...,
[ 16, 16, 16],
[ 16, 16, 16],
[ 14, 14, 14]],
...,
[[223, 223, 223],
[217, 217, 217],
[200, 200, 200],
...,
[ 16, 16, 16],
[ 16, 16, 16],
[ 14, 14, 14]],
[[226, 226, 226],
[221, 221, 221],
[204, 204, 204],
...,
[ 17, 17, 17],
[ 17, 17, 17],
[ 17, 17, 17]],
[[229, 229, 229],
[224, 224, 224],
[207, 207, 207],
...,
[ 19, 19, 19],
[ 20, 20, 20],
[ 20, 20, 20]]]], dtype=uint8)
X_tar = np.array(X)
X_tar.shape
(6000, 224, 224, 3)
print(X_tar.max())
print(X_tar.min())
255
0
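The max/min check above confirms the raw uint8 range of 0–255 is fed to the networks unscaled. A common alternative (not applied in this notebook) is to rescale to [0, 1] as float32 first, which tends to stabilise training; a minimal sketch on hypothetical data:

```python
import numpy as np

# Hypothetical batch with the same layout as X_tar: N x H x W x 3, uint8.
batch = np.random.randint(0, 256, size=(4, 224, 224, 3), dtype=np.uint8)

# Rescale to float32 in [0, 1].
batch_scaled = batch.astype(np.float32) / 255.0

print(batch_scaled.dtype, batch_scaled.min() >= 0.0, batch_scaled.max() <= 1.0)
```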
y1_tar = images_6000_jpg['Image_Target']
y1_tar.value_counts()
0    3000
1    3000
Name: Image_Target, dtype: int64
y1_tar_cat = to_categorical(y1_tar, num_classes=2)
print("Shape of y1_tar_cat:", y1_tar_cat.shape)
Shape of y1_tar_cat: (6000, 2)
y1_tar_cat[0:10]
array([[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.]], dtype=float32)
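`to_categorical` one-hot encodes the integer labels: with two classes, target 0 becomes `[1, 0]` and target 1 becomes `[0, 1]`. The same encoding can be reproduced with plain NumPy (an equivalent sketch, not the Keras internals):

```python
import numpy as np

y = np.array([0, 1, 1, 0])

# Row i of the identity matrix is the one-hot vector for class i.
y_cat = np.eye(2, dtype=np.float32)[y]

print(y_cat)
# [[1. 0.]
#  [0. 1.]
#  [0. 1.]
#  [1. 0.]]
```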
print("X_tar shape",X_tar.shape)
print("y1_tar_cat shape",y1_tar_cat.shape)
X_tar shape (6000, 224, 224, 3)
y1_tar_cat shape (6000, 2)
X_train, X_test, y_train, y_test = train_test_split(X_tar, y1_tar_cat, test_size=.20, stratify=y1_tar_cat, random_state=1) # 80% Training and 20% Testing
print("X_train",X_train.shape)
print("y_train",y_train.shape)
print("X_test",X_test.shape)
print("y_test",y_test.shape)
X_train (4800, 224, 224, 3)
y_train (4800, 2)
X_test (1200, 224, 224, 3)
y_test (1200, 2)
X_train1, X_val, y_train1, y_val = train_test_split(X_train, y_train, test_size=.20, stratify=y_train, random_state=1) # 80% Training and 20% Validation
print("X_train1",X_train1.shape)
print("y_train1",y_train1.shape)
print("X_val",X_val.shape)
print("y_val",y_val.shape)
X_train1 (3840, 224, 224, 3)
y_train1 (3840, 2)
X_val (960, 224, 224, 3)
y_val (960, 2)
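The two nested 80/20 splits leave 64% of the 6,000 images for fitting, 16% for validation, and 20% for testing; the arithmetic behind the shapes printed above:

```python
total = 6000

n_test = int(total * 0.20)        # 1200 held-out test images
n_train = total - n_test          # 4800
n_val = int(n_train * 0.20)       # 960 validation images
n_train_final = n_train - n_val   # 3840 images used to fit the models

print(n_train_final, n_val, n_test)  # 3840 960 1200
```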
print(X_val[0])
print(y_val[0])
[[[ 7  7  7]
  [16 16 16]
  [20 20 20]
  ...
  [18 18 18]
  [16 16 16]
  [16 16 16]]

 [[16 16 16]
  [28 28 28]
  [29 29 29]
  ...
  [23 23 23]
  [19 19 19]
  [19 19 19]]

 [[24 24 24]
  [37 37 37]
  [35 35 35]
  ...
  [26 26 26]
  [26 26 26]
  [23 23 23]]

 ...

 [[ 4  4  4]
  [ 5  5  5]
  [ 5  5  5]
  ...
  [ 5  5  5]
  [ 6  6  6]
  [ 6  6  6]]

 [[ 4  4  4]
  [ 4  4  4]
  [ 4  4  4]
  ...
  [ 4  4  4]
  [ 7  7  7]
  [ 7  7  7]]

 [[ 5  5  5]
  [ 4  4  4]
  [ 4  4  4]
  ...
  [ 4  4  4]
  [ 6  6  6]
  [ 6  6  6]]]
[1. 0.]
batch_size = 60
nb_epochs = 10
history = model.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
64/64 [==============================] - 44s 681ms/step - loss: 0.4550 - accuracy: 0.7818 - val_loss: 0.6614 - val_accuracy: 0.6990
Epoch 2/10
64/64 [==============================] - 47s 732ms/step - loss: 0.4115 - accuracy: 0.8094 - val_loss: 0.5971 - val_accuracy: 0.7312
Epoch 3/10
64/64 [==============================] - 46s 726ms/step - loss: 0.3861 - accuracy: 0.8227 - val_loss: 0.6779 - val_accuracy: 0.7167
Epoch 4/10
64/64 [==============================] - 46s 718ms/step - loss: 0.3687 - accuracy: 0.8310 - val_loss: 0.7029 - val_accuracy: 0.7031
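Training stopped after epoch 4 of 10 because `EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)` saw no val_loss improvement for two consecutive epochs after the epoch-2 best. A minimal re-implementation of that stopping rule (a sketch of the logic, not the Keras source):

```python
def early_stop_epoch(val_losses, patience=2, min_delta=0.001):
    """Return the 1-based epoch at which training halts, or None if it never does."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best - min_delta:   # counted as an improvement
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return None

# val_loss per epoch from the run above: best at epoch 2, then two non-improving epochs.
print(early_stop_epoch([0.6614, 0.5971, 0.6779, 0.7029]))  # 4
```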
batch_size = 60
nb_epochs = 10
history = model_2.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
64/64 [==============================] - 48s 748ms/step - loss: 0.6006 - accuracy: 0.6977 - val_loss: 0.6436 - val_accuracy: 0.6969
Epoch 2/10
64/64 [==============================] - 47s 731ms/step - loss: 0.5322 - accuracy: 0.7435 - val_loss: 0.5589 - val_accuracy: 0.7240
Epoch 3/10
64/64 [==============================] - 46s 718ms/step - loss: 0.4791 - accuracy: 0.7750 - val_loss: 0.5601 - val_accuracy: 0.7240
Epoch 4/10
64/64 [==============================] - 46s 724ms/step - loss: 0.4472 - accuracy: 0.7857 - val_loss: 0.5573 - val_accuracy: 0.7281
Epoch 5/10
64/64 [==============================] - 46s 721ms/step - loss: 0.4129 - accuracy: 0.8115 - val_loss: 0.6046 - val_accuracy: 0.6865
Epoch 6/10
64/64 [==============================] - 46s 726ms/step - loss: 0.3530 - accuracy: 0.8391 - val_loss: 0.6024 - val_accuracy: 0.7208
batch_size = 64
nb_epochs = 10
history = model_2a.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 196s 3s/step - loss: 0.5938 - accuracy: 0.6948 - val_loss: 0.6786 - val_accuracy: 0.5938
Epoch 2/10
60/60 [==============================] - 193s 3s/step - loss: 0.5177 - accuracy: 0.7471 - val_loss: 0.6701 - val_accuracy: 0.6115
Epoch 3/10
60/60 [==============================] - 191s 3s/step - loss: 0.4903 - accuracy: 0.7589 - val_loss: 0.6044 - val_accuracy: 0.7167
Epoch 4/10
60/60 [==============================] - 191s 3s/step - loss: 0.4593 - accuracy: 0.7802 - val_loss: 0.6682 - val_accuracy: 0.6542
Epoch 5/10
60/60 [==============================] - 190s 3s/step - loss: 0.4254 - accuracy: 0.7945 - val_loss: 0.5609 - val_accuracy: 0.7292
Epoch 6/10
60/60 [==============================] - 190s 3s/step - loss: 0.3949 - accuracy: 0.8151 - val_loss: 0.5965 - val_accuracy: 0.7500
Epoch 7/10
60/60 [==============================] - 190s 3s/step - loss: 0.3667 - accuracy: 0.8349 - val_loss: 0.6143 - val_accuracy: 0.7167
batch_size = 64
nb_epochs = 10
history = model_3.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 177s 3s/step - loss: 0.5916 - accuracy: 0.6953 - val_loss: 0.5836 - val_accuracy: 0.7240
Epoch 2/10
60/60 [==============================] - 175s 3s/step - loss: 0.5511 - accuracy: 0.7253 - val_loss: 0.7166 - val_accuracy: 0.5406
Epoch 3/10
60/60 [==============================] - 175s 3s/step - loss: 0.5314 - accuracy: 0.7404 - val_loss: 0.5640 - val_accuracy: 0.7271
Epoch 4/10
60/60 [==============================] - 175s 3s/step - loss: 0.5120 - accuracy: 0.7479 - val_loss: 0.7512 - val_accuracy: 0.5542
Epoch 5/10
60/60 [==============================] - 154s 3s/step - loss: 0.5021 - accuracy: 0.7578 - val_loss: 0.5693 - val_accuracy: 0.7115
RESULTS_rows = []  # DataFrame.append was removed in pandas 2.0, so collect rows in a list instead
loss_1, accuracy_1 = model.evaluate(X_test, y_test)
RESULTS_rows.append(["model", np.round(loss_1 * 100, 2), np.round(accuracy_1 * 100, 2)])
loss_2, accuracy_2 = model_2.evaluate(X_test, y_test)
RESULTS_rows.append(["model_2", np.round(loss_2 * 100, 2), np.round(accuracy_2 * 100, 2)])
loss_2a, accuracy_2a = model_2a.evaluate(X_test, y_test)
RESULTS_rows.append(["model_2a", np.round(loss_2a * 100, 2), np.round(accuracy_2a * 100, 2)])
loss_3, accuracy_3 = model_3.evaluate(X_test, y_test)
RESULTS_rows.append(["model_3", np.round(loss_3 * 100, 2), np.round(accuracy_3 * 100, 2)])
RESULTS = pd.DataFrame(RESULTS_rows)
38/38 [==============================] - 2s 48ms/step - loss: 0.7589 - accuracy: 0.6900
38/38 [==============================] - 2s 47ms/step - loss: 0.5970 - accuracy: 0.7050
38/38 [==============================] - 8s 198ms/step - loss: 0.6151 - accuracy: 0.6867
38/38 [==============================] - 7s 177ms/step - loss: 0.5346 - accuracy: 0.7383
RESULTS.columns = ['MODEL_NAME','LOSS','ACCURACY']
RESULTS.style.set_properties(**{'border': '1.3px solid black','color': 'blue'}).hide(axis="index")  # .hide_index() was removed in pandas 2.0
| MODEL_NAME | LOSS | ACCURACY |
|---|---|---|
| model | 75.890000 | 69.000000 |
| model_2 | 59.700000 | 70.500000 |
| model_2a | 61.510000 | 68.670000 |
| model_3 | 53.460000 | 73.830000 |
RESULTS.to_csv("FINE_TUNING_TO_JPG_RESULTS.csv")
# Convert JPEG images to 128*128
height = 128
width = 128
dim = (height, width)
X_target = []
brk = 0  # count of images that failed to load
for img in tqdm(images_6000_jpg["Full_filename"].values):
    try:
        train_img_resize = cv2.resize(cv2.imread(img), dim, interpolation=cv2.INTER_LINEAR)
    except cv2.error:
        brk += 1
        print("skipping unreadable image ", img)
        continue  # do not append a stale image from the previous iteration
    X_target.append(train_img_resize)
100%|█████████████████████████████████████████████████████████████████████████████| 6000/6000 [00:38<00:00, 155.36it/s]
X_tar = np.asarray(X_target)
y1_tar = images_6000_jpg['Image_Target']
y1_tar.value_counts()
0    3000
1    3000
Name: Image_Target, dtype: int64
y1_tar_cat = to_categorical(y1_tar, num_classes=2)
print("Shape of y1_tar_cat:", y1_tar_cat.shape)
Shape of y1_tar_cat: (6000, 2)
y1_tar_cat[0:10]
array([[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.],
[1., 0.]], dtype=float32)
print("X_tar shape",X_tar.shape)
print("y1_tar_cat shape",y1_tar_cat.shape)
X_tar shape (6000, 128, 128, 3)
y1_tar_cat shape (6000, 2)
X_train, X_test, y_train, y_test = train_test_split(X_tar, y1_tar_cat, test_size=.20, stratify=y1_tar_cat, random_state=1) # 80% Training and 20% Testing
print("X_train",X_train.shape)
print("y_train",y_train.shape)
X_train (4800, 128, 128, 3)
y_train (4800, 2)
print("X_test",X_test.shape)
print("y_test",y_test.shape)
X_test (1200, 128, 128, 3)
y_test (1200, 2)
X_train1, X_val, y_train1, y_val = train_test_split(X_train, y_train, test_size=.20, stratify=y_train, random_state=1) # 80% Training and 20% Validation
print("X_train1",X_train1.shape)
print("y_train1",y_train1.shape)
print("X_val",X_val.shape)
print("y_val",y_val.shape)
X_train1 (3840, 128, 128, 3)
y_train1 (3840, 2)
X_val (960, 128, 128, 3)
y_val (960, 2)
print(X_val[0])
print(y_val[0])
[[[18 18 18]
  [25 25 25]
  [28 28 28]
  ...
  [22 22 22]
  [19 19 19]
  [15 15 15]]

 [[34 34 34]
  [36 36 36]
  [41 41 41]
  ...
  [30 30 30]
  [27 27 27]
  [24 24 24]]

 [[42 42 42]
  [45 45 45]
  [46 46 46]
  ...
  [41 41 41]
  [37 37 37]
  [33 33 33]]

 ...

 [[ 5  5  5]
  [ 5  5  5]
  [ 5  5  5]
  ...
  [ 5  5  5]
  [ 5  5  5]
  [ 6  6  6]]

 [[ 5  5  5]
  [ 5  5  5]
  [ 5  5  5]
  ...
  [ 5  5  5]
  [ 5  5  5]
  [ 6  6  6]]

 [[ 6  6  6]
  [ 6  6  6]
  [ 5  5  5]
  ...
  [ 6  6  6]
  [ 6  6  6]
  [ 5  5  5]]]
[1. 0.]
tf.keras.backend.clear_session()
# Initialize the model
model = Sequential()
# Add a Convolutional Layer with 32 filters of size 7X7, strides(2,2) and activation function as 'relu'
model.add(Conv2D(filters=32, kernel_size=7, strides=(2, 2), activation="relu", input_shape=(128, 128, 3)))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(BatchNormalization())
# Add a Convolutional Layer with 64 filters of size 5X5, strides (1,1) and activation function as 'relu'
model.add(Conv2D(filters=64, kernel_size=5, strides=(1, 1), activation="relu"))
model.add(BatchNormalization())
model.add(MaxPooling2D(pool_size=(2, 2)))
# model.add(Dropout(rate=0.3))
# model.add(GlobalMaxPooling2D())
# Apply Dropout with 0.2 probability
# model.add(Dropout(rate=0.2))
# Add a Convolutional Layer with 64 filters of size 3X3 and activation function as 'relu'
# model.add(Conv2D(filters=64, kernel_size=3, strides=(2, 2), activation="relu"))
# model.add(MaxPooling2D(pool_size=(2, 2)))
# model.add(BatchNormalization())
# Flatten the layer
model.add(Flatten())
# Add Fully Connected Layer with 128 units and activation function as 'relu'
model.add(Dense(128, activation="relu"))
model.add(Dense(64, activation="relu"))
# model.add(Dropout(rate=0.3))
# model.add(BatchNormalization())
# model.add(Dense(128, activation="relu"))
# Add the output layer with 2 units and activation function as 'sigmoid'
model.add(Dense(2, activation="sigmoid"))
optimizer = Adam(learning_rate=0.01)
model.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 61, 61, 32) 4736
max_pooling2d (MaxPooling2D (None, 30, 30, 32) 0
)
batch_normalization (BatchN (None, 30, 30, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 26, 26, 64) 51264
batch_normalization_1 (Batc (None, 26, 26, 64) 256
hNormalization)
max_pooling2d_1 (MaxPooling (None, 13, 13, 64) 0
2D)
flatten (Flatten) (None, 10816) 0
dense (Dense) (None, 128) 1384576
dense_1 (Dense) (None, 64) 8256
dense_2 (Dense) (None, 2) 130
=================================================================
Total params: 1,449,346
Trainable params: 1,449,154
Non-trainable params: 192
_________________________________________________________________
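The Param # column of a summary like this can be verified by hand: a Conv2D layer has (kernel_h × kernel_w × in_channels + 1 bias) × filters parameters, and a Dense layer has (inputs + 1) × units. For example:

```python
# First Conv2D: 32 filters of size 7x7 over 3 input channels, one bias per filter.
conv1_params = (7 * 7 * 3 + 1) * 32
print(conv1_params)  # 4736

# Dense layer mapping 128 units to 64 units: weights plus biases.
dense_params = (128 + 1) * 64
print(dense_params)  # 8256
```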
batch_size = 64
nb_epochs = 10
history = model.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 11s 171ms/step - loss: 2.0457 - accuracy: 0.6555 - val_loss: 10.9216 - val_accuracy: 0.6021
Epoch 2/10
60/60 [==============================] - 10s 161ms/step - loss: 0.5977 - accuracy: 0.7036 - val_loss: 1.5250 - val_accuracy: 0.6250
Epoch 3/10
60/60 [==============================] - 10s 160ms/step - loss: 0.5896 - accuracy: 0.6883 - val_loss: 0.7360 - val_accuracy: 0.6990
Epoch 4/10
60/60 [==============================] - 10s 160ms/step - loss: 0.5509 - accuracy: 0.7201 - val_loss: 0.6632 - val_accuracy: 0.7021
Epoch 5/10
60/60 [==============================] - 10s 165ms/step - loss: 0.5440 - accuracy: 0.7320 - val_loss: 0.5619 - val_accuracy: 0.7312
Epoch 6/10
60/60 [==============================] - 10s 175ms/step - loss: 0.5294 - accuracy: 0.7352 - val_loss: 0.5498 - val_accuracy: 0.7312
Epoch 7/10
60/60 [==============================] - 10s 169ms/step - loss: 0.5253 - accuracy: 0.7411 - val_loss: 0.5506 - val_accuracy: 0.7302
Epoch 8/10
60/60 [==============================] - 10s 168ms/step - loss: 0.5251 - accuracy: 0.7331 - val_loss: 0.5508 - val_accuracy: 0.7510
tf.keras.backend.clear_session()
# second model
model_2 = Sequential()
model_2.add(Conv2D(filters=32, kernel_size=7, strides=(2, 2), activation="relu", input_shape=(128, 128, 3)))
model_2.add(MaxPooling2D(pool_size=(2, 2)))
model_2.add(BatchNormalization())
model_2.add(Conv2D(filters=64, kernel_size=5, strides=(1, 1), activation="relu"))
model_2.add(MaxPooling2D(pool_size=(2, 2)))
model_2.add(BatchNormalization())
#model_2.add(Dropout(rate=0.3))
model_2.add(Conv2D(filters=128, kernel_size=3, strides=(1, 1), activation="relu"))
model_2.add(MaxPooling2D(pool_size=(2, 2)))
model_2.add(BatchNormalization())
model_2.add(Flatten())
model_2.add(Dense(128, activation="relu"))
# model_2.add(BatchNormalization())
model_2.add(Dense(84, activation="relu"))
# model_2.add(BatchNormalization())
model_2.add(Dense(42, activation="relu"))
model_2.add(Dense(2, activation="sigmoid"))
optimizer = Adam(learning_rate=0.001)
model_2.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_2.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 61, 61, 32) 4736
max_pooling2d (MaxPooling2D (None, 30, 30, 32) 0
)
batch_normalization (BatchN (None, 30, 30, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 26, 26, 64) 51264
max_pooling2d_1 (MaxPooling (None, 13, 13, 64) 0
2D)
batch_normalization_1 (Batc (None, 13, 13, 64) 256
hNormalization)
conv2d_2 (Conv2D) (None, 11, 11, 128) 73856
max_pooling2d_2 (MaxPooling (None, 5, 5, 128) 0
2D)
batch_normalization_2 (Batc (None, 5, 5, 128) 512
hNormalization)
flatten (Flatten) (None, 3200) 0
dense (Dense) (None, 128) 409728
dense_1 (Dense) (None, 84) 10836
dense_2 (Dense) (None, 42) 3570
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 554,972
Trainable params: 554,524
Non-trainable params: 448
_________________________________________________________________
batch_size = 64
nb_epochs = 10
history_2 = model_2.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=3, min_delta=0.0001)])
Epoch 1/10
60/60 [==============================] - 11s 174ms/step - loss: 0.6044 - accuracy: 0.6930 - val_loss: 1.5019 - val_accuracy: 0.5000
Epoch 2/10
60/60 [==============================] - 10s 168ms/step - loss: 0.5583 - accuracy: 0.7234 - val_loss: 0.8792 - val_accuracy: 0.5875
Epoch 3/10
60/60 [==============================] - 10s 170ms/step - loss: 0.5395 - accuracy: 0.7365 - val_loss: 0.6353 - val_accuracy: 0.7063
Epoch 4/10
60/60 [==============================] - 10s 171ms/step - loss: 0.5238 - accuracy: 0.7497 - val_loss: 0.9062 - val_accuracy: 0.5521
Epoch 5/10
60/60 [==============================] - 10s 171ms/step - loss: 0.5115 - accuracy: 0.7529 - val_loss: 0.5849 - val_accuracy: 0.6875
Epoch 6/10
60/60 [==============================] - 10s 171ms/step - loss: 0.4860 - accuracy: 0.7628 - val_loss: 0.5983 - val_accuracy: 0.6969
Epoch 7/10
60/60 [==============================] - 10s 172ms/step - loss: 0.4744 - accuracy: 0.7716 - val_loss: 0.6039 - val_accuracy: 0.7083
Epoch 8/10
60/60 [==============================] - 10s 175ms/step - loss: 0.4432 - accuracy: 0.7878 - val_loss: 0.6395 - val_accuracy: 0.7125
tf.keras.backend.clear_session()
# third model (model_2a)
model_2a = Sequential()
model_2a.add(Conv2D(filters=32, kernel_size=5, strides=(1, 1), activation="relu", input_shape=(128, 128, 3)))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Conv2D(filters=64, kernel_size=5, strides=(1, 1), activation="relu"))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Conv2D(filters=128, kernel_size=3, strides=(1, 1), activation="relu"))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Dropout(rate=0.2))
model_2a.add(Conv2D(filters=256, kernel_size=3, strides=(1, 1), activation="relu"))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Dropout(rate=0.2))
#model_2a.add(Conv2D(filters=512, kernel_size=3, strides=(1, 1), activation="relu"))
#model_2a.add(MaxPooling2D(pool_size=(2, 2)))
#model_2a.add(BatchNormalization())
model_2a.add(Flatten())
#model_2a.add(Dense(512, activation="relu"))
# model_2a.add(Dropout(rate=0.7))
model_2a.add(Dense(256, activation="relu"))
model_2a.add(Dropout(rate=0.25))
model_2a.add(Dense(128, activation="relu"))
# model_2a.add(Dropout(rate=0.3))
model_2a.add(Dense(64, activation="relu"))
model_2a.add(Dense(32, activation="relu"))
model_2a.add(Dense(2, activation="sigmoid"))
optimizer = Adam(learning_rate=0.0001)
model_2a.compile(optimizer = optimizer , loss = "categorical_crossentropy", metrics=["accuracy"])
model_2a.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 124, 124, 32) 2432
max_pooling2d (MaxPooling2D (None, 62, 62, 32) 0
)
batch_normalization (BatchN (None, 62, 62, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 58, 58, 64) 51264
max_pooling2d_1 (MaxPooling (None, 29, 29, 64) 0
2D)
batch_normalization_1 (Batc (None, 29, 29, 64) 256
hNormalization)
conv2d_2 (Conv2D) (None, 27, 27, 128) 73856
max_pooling2d_2 (MaxPooling (None, 13, 13, 128) 0
2D)
batch_normalization_2 (Batc (None, 13, 13, 128) 512
hNormalization)
dropout (Dropout) (None, 13, 13, 128) 0
conv2d_3 (Conv2D) (None, 11, 11, 256) 295168
max_pooling2d_3 (MaxPooling (None, 5, 5, 256) 0
2D)
batch_normalization_3 (Batc (None, 5, 5, 256) 1024
hNormalization)
dropout_1 (Dropout) (None, 5, 5, 256) 0
flatten (Flatten) (None, 6400) 0
dense (Dense) (None, 256) 1638656
dropout_2 (Dropout) (None, 256) 0
dense_1 (Dense) (None, 128) 32896
dense_2 (Dense) (None, 64) 8256
dense_3 (Dense) (None, 32) 2080
dense_4 (Dense) (None, 2) 66
=================================================================
Total params: 2,106,594
Trainable params: 2,105,634
Non-trainable params: 960
_________________________________________________________________
batch_size = 64
nb_epochs = 10
history_2a = model_2a.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=3, min_delta=0.01)])
Epoch 1/10
60/60 [==============================] - 43s 704ms/step - loss: 0.6161 - accuracy: 0.6706 - val_loss: 0.6527 - val_accuracy: 0.6875
Epoch 2/10
60/60 [==============================] - 45s 748ms/step - loss: 0.5679 - accuracy: 0.7182 - val_loss: 0.6016 - val_accuracy: 0.7125
Epoch 3/10
60/60 [==============================] - 47s 783ms/step - loss: 0.5503 - accuracy: 0.7214 - val_loss: 0.5577 - val_accuracy: 0.7385
Epoch 4/10
60/60 [==============================] - 47s 789ms/step - loss: 0.5364 - accuracy: 0.7318 - val_loss: 0.5452 - val_accuracy: 0.7375
Epoch 5/10
60/60 [==============================] - 48s 792ms/step - loss: 0.5254 - accuracy: 0.7383 - val_loss: 0.5456 - val_accuracy: 0.7406
Epoch 6/10
60/60 [==============================] - 48s 801ms/step - loss: 0.5046 - accuracy: 0.7526 - val_loss: 0.5574 - val_accuracy: 0.7260
Epoch 7/10
60/60 [==============================] - 48s 800ms/step - loss: 0.5071 - accuracy: 0.7508 - val_loss: 0.5760 - val_accuracy: 0.7229
tf.keras.backend.clear_session()
# fourth model (model_3)
model_3 = Sequential()
model_3.add(Conv2D(filters=32, kernel_size=5, strides=(1, 1), activation="relu", input_shape=(128, 128, 3)))
model_3.add(MaxPooling2D(pool_size=(2, 2)))
model_3.add(Conv2D(filters=64, kernel_size=5, strides=(1, 1), kernel_initializer='normal', activation="relu"))
model_3.add(MaxPooling2D(pool_size=(2, 2)))
model_3.add(Conv2D(filters=128, kernel_size=3, strides=(1, 1), kernel_initializer='normal', activation="relu"))
model_3.add(MaxPooling2D(pool_size=(2, 2)))
model_3.add(BatchNormalization())
model_3.add(Conv2D(filters=256, kernel_size=3, strides=(1, 1), kernel_initializer='normal', activation="relu"))
model_3.add(MaxPooling2D(pool_size=(2, 2)))
model_3.add(Dropout(rate=0.3))
model_3.add(Flatten())
model_3.add(Dense(128, kernel_initializer='normal', activation="relu"))
model_3.add(Dense(84, kernel_initializer='normal', activation="relu"))
model_3.add(Dense(42, kernel_initializer='normal', activation="relu"))
model_3.add(Dense(2, activation="sigmoid"))
optimizer = Adam(learning_rate=0.0001)
model_3.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_3.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 124, 124, 32) 2432
max_pooling2d (MaxPooling2D (None, 62, 62, 32) 0
)
conv2d_1 (Conv2D) (None, 58, 58, 64) 51264
max_pooling2d_1 (MaxPooling (None, 29, 29, 64) 0
2D)
conv2d_2 (Conv2D) (None, 27, 27, 128) 73856
max_pooling2d_2 (MaxPooling (None, 13, 13, 128) 0
2D)
batch_normalization (BatchN (None, 13, 13, 128) 512
ormalization)
conv2d_3 (Conv2D) (None, 11, 11, 256) 295168
max_pooling2d_3 (MaxPooling (None, 5, 5, 256) 0
2D)
dropout (Dropout) (None, 5, 5, 256) 0
flatten (Flatten) (None, 6400) 0
dense (Dense) (None, 128) 819328
dense_1 (Dense) (None, 84) 10836
dense_2 (Dense) (None, 42) 3570
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 1,257,052
Trainable params: 1,256,796
Non-trainable params: 256
_________________________________________________________________
batch_size = 64
nb_epochs = 10
history_3 = model_3.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=3, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 38s 621ms/step - loss: 0.6269 - accuracy: 0.6630 - val_loss: 0.6533 - val_accuracy: 0.6510
Epoch 2/10
60/60 [==============================] - 42s 703ms/step - loss: 0.5656 - accuracy: 0.7167 - val_loss: 0.6310 - val_accuracy: 0.6406
Epoch 3/10
60/60 [==============================] - 43s 716ms/step - loss: 0.5524 - accuracy: 0.7279 - val_loss: 0.5767 - val_accuracy: 0.7219
Epoch 4/10
60/60 [==============================] - 43s 715ms/step - loss: 0.5385 - accuracy: 0.7365 - val_loss: 0.5473 - val_accuracy: 0.7354
Epoch 5/10
60/60 [==============================] - 43s 711ms/step - loss: 0.5307 - accuracy: 0.7388 - val_loss: 0.5597 - val_accuracy: 0.7198
Epoch 6/10
60/60 [==============================] - 43s 715ms/step - loss: 0.5076 - accuracy: 0.7560 - val_loss: 0.5916 - val_accuracy: 0.7021
Epoch 7/10
60/60 [==============================] - 43s 712ms/step - loss: 0.5026 - accuracy: 0.7578 - val_loss: 0.5732 - val_accuracy: 0.7208
RESULTS_rows = []  # DataFrame.append was removed in pandas 2.0, so collect rows in a list instead
loss_1, accuracy_1 = model.evaluate(X_test, y_test)
RESULTS_rows.append(["model", np.round(loss_1 * 100, 2), np.round(accuracy_1 * 100, 2)])
loss_2, accuracy_2 = model_2.evaluate(X_test, y_test)
RESULTS_rows.append(["model_2", np.round(loss_2 * 100, 2), np.round(accuracy_2 * 100, 2)])
loss_2a, accuracy_2a = model_2a.evaluate(X_test, y_test)
RESULTS_rows.append(["model_2a", np.round(loss_2a * 100, 2), np.round(accuracy_2a * 100, 2)])
loss_3, accuracy_3 = model_3.evaluate(X_test, y_test)
RESULTS_rows.append(["model_3", np.round(loss_3 * 100, 2), np.round(accuracy_3 * 100, 2)])
RESULTS = pd.DataFrame(RESULTS_rows)
38/38 [==============================] - 1s 16ms/step - loss: 0.5358 - accuracy: 0.7325
38/38 [==============================] - 1s 18ms/step - loss: 0.6166 - accuracy: 0.7183
38/38 [==============================] - 3s 62ms/step - loss: 0.5806 - accuracy: 0.7058
38/38 [==============================] - 2s 55ms/step - loss: 0.5660 - accuracy: 0.7225
RESULTS.columns = ['MODEL_NAME','LOSS','ACCURACY']
RESULTS.style.set_properties(**{'border': '1.3px solid black','color': 'blue'}).hide(axis="index")  # .hide_index() was removed in pandas 2.0
| MODEL_NAME | LOSS | ACCURACY |
|---|---|---|
| model | 53.580000 | 73.250000 |
| model_2 | 61.660000 | 71.830000 |
| model_2a | 58.060000 | 70.580000 |
| model_3 | 56.600000 | 72.250000 |
RESULTS.to_csv("JPG_128_128_LOSS_ACCURACY.csv",index = False)
* Observations:
- Reducing the JPEG images to 128x128 yields no significant improvement in accuracy.
- This is expected: shrinking images too much can discard important information.
- Another likely factor is that the images were already converted from DICOM to JPEG, which itself loses detail.
- Model training is noticeably faster, however, because the smaller images require less computation and memory.
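The information-loss point can be illustrated with a toy example: average-pooling a fine checkerboard by 2x collapses it to a flat image, so the original pattern is unrecoverable (a NumPy sketch, not from the notebook):

```python
import numpy as np

# 4x4 checkerboard of 0s and 255s.
board = np.indices((4, 4)).sum(axis=0) % 2 * 255

# 2x2 average-pool: each output pixel is the mean of a 2x2 block of the input.
small = board.reshape(2, 2, 2, 2).mean(axis=(1, 3))

print(small)  # every entry is 127.5: the checkerboard pattern is gone
```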
img_rows=128
img_cols=128
dim = (img_rows, img_cols)
X_target = []
brk = 0  # count of images that failed to resize
for img in tqdm(Images_6000_Target["Full_filename"].values):
    ds_3 = dicom.dcmread(img)                          # read the DICOM file
    img_3 = ds_3.pixel_array                           # raw pixel array
    train_img = apply_color_lut(img_3, palette='PET')  # map grayscale to the PET colour palette
    try:
        train_img_resize = cv2.resize(train_img, dim, interpolation=cv2.INTER_LINEAR)
    except cv2.error:
        brk += 1
        print("skipping image ", img)
        continue  # skip this image instead of aborting the whole loop with break
    X_target.append(train_img_resize)
100%|██████████████████████████████████████████████████████████████████████████████| 6000/6000 [02:36<00:00, 38.33it/s]
X_tar = np.asarray(X_target)
y1_tar = Images_6000_Target['Image_Target']
y1_tar.value_counts()
0    3000
1    3000
Name: Image_Target, dtype: int64
y1_tar_cat = to_categorical(y1_tar, num_classes=2)
print("Shape of y1_tar_cat:", y1_tar_cat.shape)
Shape of y1_tar_cat: (6000, 2)
print("X_tar shape",X_tar.shape)
print("y1_tar_cat shape",y1_tar_cat.shape)
X_tar shape (6000, 128, 128, 3)
y1_tar_cat shape (6000, 2)
X_train, X_test, y_train, y_test = train_test_split(X_tar, y1_tar_cat, test_size=.20, stratify=y1_tar_cat, random_state=1) # 80% Training and 20% Testing
print("X_train",X_train.shape)
print("y_train",y_train.shape)
print("X_test",X_test.shape)
print("y_test",y_test.shape)
X_train (4800, 128, 128, 3)
y_train (4800, 2)
X_test (1200, 128, 128, 3)
y_test (1200, 2)
X_train1, X_val, y_train1, y_val = train_test_split(X_train, y_train, test_size=.20, stratify=y_train, random_state=1) # 80% Training and 20% Validation
print("X_train1",X_train1.shape)
print("y_train1",y_train1.shape)
print("X_val",X_val.shape)
print("y_val",y_val.shape)
X_train1 (3840, 128, 128, 3)
y_train1 (3840, 2)
X_val (960, 128, 128, 3)
y_val (960, 2)
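The two nested 80/20 splits above determine the set sizes. A quick sketch of the arithmetic (the `split_sizes` helper is just for illustration):

```python
def split_sizes(n, test_frac=0.20, val_frac=0.20):
    # First split holds out the test set; the second carves a
    # validation set out of the remaining training pool, matching
    # the two train_test_split calls above.
    n_test = int(n * test_frac)      # 6000 -> 1200
    n_pool = n - n_test              # 4800
    n_val = int(n_pool * val_frac)   # 960
    n_train = n_pool - n_val         # 3840
    return n_train, n_val, n_test

print(split_sizes(6000))  # → (3840, 960, 1200)
```

These match the printed shapes: 3840 training, 960 validation, and 1200 test images.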
print(X_val[0])
print(y_val[0])
[[[ 0 97 95] [ 0 47 46] [ 0 19 18] ... [ 0 66 65] [ 0 66 64] [ 0 80 78]]
 [[ 0 92 90] [ 0 44 43] [ 0 16 15] ... [ 0 12 11] [ 0 14 13] [ 0 23 22]]
 [[ 0 92 90] [ 0 45 44] [ 0 14 13] ... [ 0 0 0] [ 0 0 0] [ 0 6 5]]
 ...
 [[ 0 6 5] [ 0 0 0] [ 0 0 0] ... [ 0 3 2] [ 0 26 25] [ 0 63 61]]
 [[ 0 15 14] [ 0 7 6] [ 0 7 6] ... [ 0 3 2] [ 0 24 23] [ 0 64 62]]
 [[ 0 22 21] [ 0 24 23] [ 0 17 16] ... [ 0 3 2] [ 0 27 26] [ 0 65 63]]]
[1. 0.]
batch_size = 64
nb_epochs = 10
history = model.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 30s 498ms/step - loss: 0.5331 - accuracy: 0.7383 - val_loss: 0.5408 - val_accuracy: 0.7396
Epoch 2/10
60/60 [==============================] - 29s 488ms/step - loss: 0.5188 - accuracy: 0.7427 - val_loss: 0.5625 - val_accuracy: 0.7437
Epoch 3/10
60/60 [==============================] - 29s 490ms/step - loss: 0.5063 - accuracy: 0.7417 - val_loss: 0.5367 - val_accuracy: 0.7542
Epoch 4/10
60/60 [==============================] - 29s 489ms/step - loss: 0.4823 - accuracy: 0.7615 - val_loss: 0.5322 - val_accuracy: 0.7417
Epoch 5/10
60/60 [==============================] - 29s 490ms/step - loss: 0.4720 - accuracy: 0.7638 - val_loss: 0.5391 - val_accuracy: 0.7510
Epoch 6/10
60/60 [==============================] - 29s 486ms/step - loss: 0.4721 - accuracy: 0.7651 - val_loss: 0.5544 - val_accuracy: 0.7542
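Training stops after 6 of 10 epochs because of the `EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)` callback. A rough pure-Python sketch of that rule (not Keras's exact implementation) applied to the `val_loss` values from the log above:

```python
def early_stopping_epochs(val_losses, patience=2, min_delta=0.001):
    # Sketch of Keras-like early stopping: an epoch "improves" only if
    # val_loss drops below the best value seen so far by more than
    # min_delta; training stops once `patience` consecutive epochs
    # fail to improve.
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best - min_delta:
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch  # stop at the end of this epoch
    return len(val_losses)

# val_loss values from the log above
print(early_stopping_epochs([0.5408, 0.5625, 0.5367, 0.5322, 0.5391, 0.5544]))
# → 6: epochs 5 and 6 both fail to beat the epoch-4 best of 0.5322
```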
batch_size = 64
nb_epochs = 10
history = model_2.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 31s 523ms/step - loss: 0.5176 - accuracy: 0.7484 - val_loss: 0.5956 - val_accuracy: 0.7125
Epoch 2/10
60/60 [==============================] - 31s 517ms/step - loss: 0.4639 - accuracy: 0.7750 - val_loss: 0.5294 - val_accuracy: 0.7552
Epoch 3/10
60/60 [==============================] - 32s 536ms/step - loss: 0.4211 - accuracy: 0.8042 - val_loss: 0.5347 - val_accuracy: 0.7573
Epoch 4/10
60/60 [==============================] - 32s 531ms/step - loss: 0.4129 - accuracy: 0.8083 - val_loss: 0.5951 - val_accuracy: 0.7063
batch_size = 64
nb_epochs = 10
history = model_2a.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 137s 2s/step - loss: 0.5868 - accuracy: 0.6919 - val_loss: 0.5295 - val_accuracy: 0.7510
Epoch 2/10
60/60 [==============================] - 141s 2s/step - loss: 0.5361 - accuracy: 0.7323 - val_loss: 0.5266 - val_accuracy: 0.7469
Epoch 3/10
60/60 [==============================] - 143s 2s/step - loss: 0.5199 - accuracy: 0.7424 - val_loss: 0.5092 - val_accuracy: 0.7552
Epoch 4/10
60/60 [==============================] - 139s 2s/step - loss: 0.5071 - accuracy: 0.7589 - val_loss: 0.5193 - val_accuracy: 0.7510
Epoch 5/10
60/60 [==============================] - 140s 2s/step - loss: 0.4805 - accuracy: 0.7661 - val_loss: 0.5187 - val_accuracy: 0.7625
batch_size = 64
nb_epochs = 10
history = model_3.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 135s 2s/step - loss: 0.5936 - accuracy: 0.6911 - val_loss: 0.6041 - val_accuracy: 0.6969
Epoch 2/10
60/60 [==============================] - 134s 2s/step - loss: 0.5390 - accuracy: 0.7310 - val_loss: 0.5610 - val_accuracy: 0.7302
Epoch 3/10
60/60 [==============================] - 133s 2s/step - loss: 0.5317 - accuracy: 0.7370 - val_loss: 0.6009 - val_accuracy: 0.6823
Epoch 4/10
60/60 [==============================] - 133s 2s/step - loss: 0.5085 - accuracy: 0.7531 - val_loss: 0.5305 - val_accuracy: 0.7490
Epoch 5/10
60/60 [==============================] - 133s 2s/step - loss: 0.4935 - accuracy: 0.7596 - val_loss: 0.5251 - val_accuracy: 0.7510
Epoch 6/10
60/60 [==============================] - 137s 2s/step - loss: 0.4869 - accuracy: 0.7664 - val_loss: 0.5264 - val_accuracy: 0.7510
Epoch 7/10
60/60 [==============================] - 146s 2s/step - loss: 0.4746 - accuracy: 0.7745 - val_loss: 0.5238 - val_accuracy: 0.7510
Epoch 8/10
60/60 [==============================] - 147s 2s/step - loss: 0.4500 - accuracy: 0.7943 - val_loss: 0.5436 - val_accuracy: 0.7354
Epoch 9/10
60/60 [==============================] - 146s 2s/step - loss: 0.4433 - accuracy: 0.7945 - val_loss: 0.5415 - val_accuracy: 0.7458
# DataFrame.append and Styler.hide_index were removed in pandas 2.0;
# collect the rows in a list and build the frame in one go instead.
rows = []
loss_1, accuracy_1 = model.evaluate(X_test, y_test)
rows.append(["model", np.round(loss_1 * 100, 2), np.round(accuracy_1 * 100, 2)])
loss_2, accuracy_2 = model_2.evaluate(X_test, y_test)
rows.append(["model_2", np.round(loss_2 * 100, 2), np.round(accuracy_2 * 100, 2)])
loss_2a, accuracy_2a = model_2a.evaluate(X_test, y_test)
rows.append(["model_2a", np.round(loss_2a * 100, 2), np.round(accuracy_2a * 100, 2)])
loss_3, accuracy_3 = model_3.evaluate(X_test, y_test)
rows.append(["model_3", np.round(loss_3 * 100, 2), np.round(accuracy_3 * 100, 2)])
RESULTS = pd.DataFrame(rows, columns=['MODEL_NAME', 'LOSS', 'ACCURACY'])
RESULTS.style.set_properties(**{'border': '1.3px solid black', 'color': 'blue'}).hide(axis="index")
38/38 [==============================] - 2s 46ms/step - loss: 0.5507 - accuracy: 0.7383
38/38 [==============================] - 2s 43ms/step - loss: 0.6108 - accuracy: 0.6792
38/38 [==============================] - 7s 189ms/step - loss: 0.5465 - accuracy: 0.7308
38/38 [==============================] - 7s 174ms/step - loss: 0.5600 - accuracy: 0.7325
| MODEL_NAME | LOSS | ACCURACY |
|---|---|---|
| model | 55.070000 | 73.830000 |
| model_2 | 61.080000 | 67.920000 |
| model_2a | 54.650000 | 73.080000 |
| model_3 | 56.000000 | 73.250000 |
RESULTS.to_csv("DICOM_128x128_Accuracy_Loss.csv",index = False)
* Use 224x224 DICOM image data for training the models.
* Add the beta and epsilon parameters to the Adam optimizer.
* Switch the loss from binary_crossentropy to categorical_crossentropy.
* Use models whose input_size is 224x224.
# Use a context manager so the pickle file is closed automatically
with open("Images_X_3_target_6000.pkl", "rb") as image_X_3:
    X = pkl.load(image_X_3)
X_tar = np.array(X)
y1_tar = Images_6000_Target['Image_Target']
y1_tar_cat = to_categorical(y1_tar, num_classes=2)
X_train, X_test, y_train, y_test = train_test_split(X_tar, y1_tar_cat, test_size=.20, stratify=y1_tar_cat, random_state=1) # 80% Training and 20% Testing
X_train1, X_val, y_train1, y_val = train_test_split(X_train, y_train, test_size=.20, stratify=y_train, random_state=1) # 80% Training and 20% validation
print("X_train",X_train.shape)
print("y_train",y_train.shape)
print("")
print("X_test",X_test.shape)
print("y_test",y_test.shape)
print("")
print("X_val",X_val.shape)
print("y_val",y_val.shape)
X_train (4800, 224, 224, 3)
y_train (4800, 2)

X_test (1200, 224, 224, 3)
y_test (1200, 2)

X_val (960, 224, 224, 3)
y_val (960, 2)
optimizer = Adam(learning_rate=0.00001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
model.compile(optimizer = optimizer , loss = "categorical_crossentropy", metrics=["accuracy"])
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 109, 109, 32) 4736
max_pooling2d (MaxPooling2D (None, 54, 54, 32) 0
)
batch_normalization (BatchN (None, 54, 54, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 50, 50, 64) 51264
batch_normalization_1 (Batc (None, 50, 50, 64) 256
hNormalization)
max_pooling2d_1 (MaxPooling (None, 25, 25, 64) 0
2D)
flatten (Flatten) (None, 40000) 0
dense (Dense) (None, 128) 5120128
dense_1 (Dense) (None, 64) 8256
dense_2 (Dense) (None, 2) 130
=================================================================
Total params: 5,184,898
Trainable params: 5,184,706
Non-trainable params: 192
_________________________________________________________________
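The parameter counts in the summary above can be checked by hand. Assuming the usual Keras formulas (and kernel sizes inferred from the output shapes: a 7x7 kernel with stride 2 for `conv2d`, a 5x5 kernel for `conv2d_1`), a small sketch reproduces them:

```python
def conv2d_params(kernel_h, kernel_w, in_channels, filters):
    # Each filter has kernel_h*kernel_w*in_channels weights plus one bias
    return (kernel_h * kernel_w * in_channels + 1) * filters

def dense_params(in_units, out_units):
    # Weight matrix plus one bias per output unit
    return in_units * out_units + out_units

print(conv2d_params(7, 7, 3, 32))   # → 4736    (conv2d)
print(conv2d_params(5, 5, 32, 64))  # → 51264   (conv2d_1)
print(dense_params(40000, 128))     # → 5120128 (dense)
print(dense_params(128, 64))        # → 8256    (dense_1)
print(dense_params(64, 2))          # → 130     (dense_2)
```

Each BatchNormalization layer contributes 4 parameters per channel (gamma, beta, moving mean, moving variance), of which only gamma and beta are trainable, which accounts for the 192 non-trainable parameters.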
batch_size = 64
nb_epochs = 10
history = model.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 130s 2s/step - loss: 0.6349 - accuracy: 0.6740 - val_loss: 0.6424 - val_accuracy: 0.6448
Epoch 2/10
60/60 [==============================] - 128s 2s/step - loss: 0.4961 - accuracy: 0.7622 - val_loss: 0.5932 - val_accuracy: 0.6854
Epoch 3/10
60/60 [==============================] - 130s 2s/step - loss: 0.4248 - accuracy: 0.8089 - val_loss: 0.5791 - val_accuracy: 0.7021
Epoch 4/10
60/60 [==============================] - 129s 2s/step - loss: 0.3636 - accuracy: 0.8497 - val_loss: 0.5531 - val_accuracy: 0.7240
Epoch 5/10
60/60 [==============================] - 128s 2s/step - loss: 0.3106 - accuracy: 0.8872 - val_loss: 0.5593 - val_accuracy: 0.7156
Epoch 6/10
60/60 [==============================] - 126s 2s/step - loss: 0.2639 - accuracy: 0.9143 - val_loss: 0.5699 - val_accuracy: 0.7229
optimizer = Adam(learning_rate=0.00001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
model_2.compile(optimizer = optimizer , loss = "categorical_crossentropy", metrics=["accuracy"])
model_2.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 109, 109, 32) 4736
max_pooling2d (MaxPooling2D (None, 54, 54, 32) 0
)
batch_normalization (BatchN (None, 54, 54, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 50, 50, 64) 51264
max_pooling2d_1 (MaxPooling (None, 25, 25, 64) 0
2D)
batch_normalization_1 (Batc (None, 25, 25, 64) 256
hNormalization)
conv2d_2 (Conv2D) (None, 23, 23, 128) 73856
max_pooling2d_2 (MaxPooling (None, 11, 11, 128) 0
2D)
batch_normalization_2 (Batc (None, 11, 11, 128) 512
hNormalization)
flatten (Flatten) (None, 15488) 0
dense (Dense) (None, 128) 1982592
dense_1 (Dense) (None, 84) 10836
dense_2 (Dense) (None, 42) 3570
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 2,127,836
Trainable params: 2,127,388
Non-trainable params: 448
_________________________________________________________________
batch_size = 64
nb_epochs = 10
history = model_2.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 135s 2s/step - loss: 0.6248 - accuracy: 0.6591 - val_loss: 0.7425 - val_accuracy: 0.5365
Epoch 2/10
60/60 [==============================] - 134s 2s/step - loss: 0.5363 - accuracy: 0.7331 - val_loss: 0.6199 - val_accuracy: 0.6677
Epoch 3/10
60/60 [==============================] - 133s 2s/step - loss: 0.4941 - accuracy: 0.7635 - val_loss: 0.5708 - val_accuracy: 0.7115
Epoch 4/10
60/60 [==============================] - 134s 2s/step - loss: 0.4607 - accuracy: 0.7906 - val_loss: 0.5489 - val_accuracy: 0.7312
Epoch 5/10
60/60 [==============================] - 134s 2s/step - loss: 0.4323 - accuracy: 0.8130 - val_loss: 0.5407 - val_accuracy: 0.7333
Epoch 6/10
60/60 [==============================] - 135s 2s/step - loss: 0.4048 - accuracy: 0.8307 - val_loss: 0.5402 - val_accuracy: 0.7354
Epoch 7/10
60/60 [==============================] - 134s 2s/step - loss: 0.3792 - accuracy: 0.8474 - val_loss: 0.5410 - val_accuracy: 0.7396
optimizer = Adam(learning_rate=0.00001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
model_2a.compile(optimizer = optimizer , loss = "categorical_crossentropy", metrics=["accuracy"])
model_2a.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 220, 220, 32) 2432
max_pooling2d (MaxPooling2D (None, 110, 110, 32) 0
)
batch_normalization (BatchN (None, 110, 110, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 106, 106, 64) 51264
max_pooling2d_1 (MaxPooling (None, 53, 53, 64) 0
2D)
batch_normalization_1 (Batc (None, 53, 53, 64) 256
hNormalization)
conv2d_2 (Conv2D) (None, 51, 51, 128) 73856
max_pooling2d_2 (MaxPooling (None, 25, 25, 128) 0
2D)
batch_normalization_2 (Batc (None, 25, 25, 128) 512
hNormalization)
dropout (Dropout) (None, 25, 25, 128) 0
conv2d_3 (Conv2D) (None, 23, 23, 256) 295168
max_pooling2d_3 (MaxPooling (None, 11, 11, 256) 0
2D)
batch_normalization_3 (Batc (None, 11, 11, 256) 1024
hNormalization)
dropout_1 (Dropout) (None, 11, 11, 256) 0
flatten (Flatten) (None, 30976) 0
dense (Dense) (None, 256) 7930112
dropout_2 (Dropout) (None, 256) 0
dense_1 (Dense) (None, 128) 32896
dense_2 (Dense) (None, 64) 8256
dense_3 (Dense) (None, 32) 2080
dense_4 (Dense) (None, 2) 66
=================================================================
Total params: 8,398,050
Trainable params: 8,397,090
Non-trainable params: 960
_________________________________________________________________
batch_size = 64
nb_epochs = 10
history = model_2a.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 486s 8s/step - loss: 0.6483 - accuracy: 0.6344 - val_loss: 0.6356 - val_accuracy: 0.6313
Epoch 2/10
60/60 [==============================] - 657s 11s/step - loss: 0.5740 - accuracy: 0.7201 - val_loss: 0.6000 - val_accuracy: 0.7219
Epoch 3/10
60/60 [==============================] - 660s 11s/step - loss: 0.5566 - accuracy: 0.7180 - val_loss: 0.5627 - val_accuracy: 0.7333
Epoch 4/10
60/60 [==============================] - 662s 11s/step - loss: 0.5311 - accuracy: 0.7398 - val_loss: 0.5487 - val_accuracy: 0.7281
Epoch 5/10
60/60 [==============================] - 664s 11s/step - loss: 0.5235 - accuracy: 0.7396 - val_loss: 0.5421 - val_accuracy: 0.7333
Epoch 6/10
60/60 [==============================] - 577s 10s/step - loss: 0.5029 - accuracy: 0.7560 - val_loss: 0.5343 - val_accuracy: 0.7448
Epoch 7/10
60/60 [==============================] - 470s 8s/step - loss: 0.4948 - accuracy: 0.7560 - val_loss: 0.5312 - val_accuracy: 0.7375
Epoch 8/10
60/60 [==============================] - 514s 9s/step - loss: 0.4863 - accuracy: 0.7633 - val_loss: 0.5305 - val_accuracy: 0.7417
Epoch 9/10
60/60 [==============================] - 514s 9s/step - loss: 0.4691 - accuracy: 0.7812 - val_loss: 0.5332 - val_accuracy: 0.7292
optimizer = Adam(learning_rate=0.00001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
model_3.compile(optimizer = optimizer , loss = "categorical_crossentropy", metrics=["accuracy"])
model_3.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 220, 220, 32) 2432
max_pooling2d (MaxPooling2D (None, 110, 110, 32) 0
)
conv2d_1 (Conv2D) (None, 106, 106, 64) 51264
max_pooling2d_1 (MaxPooling (None, 53, 53, 64) 0
2D)
conv2d_2 (Conv2D) (None, 51, 51, 128) 73856
max_pooling2d_2 (MaxPooling (None, 25, 25, 128) 0
2D)
batch_normalization (BatchN (None, 25, 25, 128) 512
ormalization)
conv2d_3 (Conv2D) (None, 23, 23, 256) 295168
max_pooling2d_3 (MaxPooling (None, 11, 11, 256) 0
2D)
dropout (Dropout) (None, 11, 11, 256) 0
flatten (Flatten) (None, 30976) 0
dense (Dense) (None, 128) 3965056
dense_1 (Dense) (None, 84) 10836
dense_2 (Dense) (None, 42) 3570
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 4,402,780
Trainable params: 4,402,524
Non-trainable params: 256
_________________________________________________________________
batch_size = 64
nb_epochs = 10
history = model_3.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 471s 8s/step - loss: 0.8552 - accuracy: 0.5497 - val_loss: 0.7189 - val_accuracy: 0.5354
Epoch 2/10
60/60 [==============================] - 641s 11s/step - loss: 0.6591 - accuracy: 0.6362 - val_loss: 0.6445 - val_accuracy: 0.6385
Epoch 3/10
60/60 [==============================] - 638s 11s/step - loss: 0.6219 - accuracy: 0.6664 - val_loss: 0.5995 - val_accuracy: 0.6917
Epoch 4/10
60/60 [==============================] - 648s 11s/step - loss: 0.6093 - accuracy: 0.6836 - val_loss: 0.5757 - val_accuracy: 0.7115
Epoch 5/10
60/60 [==============================] - 640s 11s/step - loss: 0.5981 - accuracy: 0.6857 - val_loss: 0.5678 - val_accuracy: 0.6927
Epoch 6/10
60/60 [==============================] - 10691s 181s/step - loss: 0.5924 - accuracy: 0.6971 - val_loss: 0.5605 - val_accuracy: 0.6979
Epoch 7/10
60/60 [==============================] - 439s 7s/step - loss: 0.5731 - accuracy: 0.7094 - val_loss: 0.5578 - val_accuracy: 0.7125
Epoch 8/10
60/60 [==============================] - 450s 8s/step - loss: 0.5682 - accuracy: 0.7185 - val_loss: 0.5504 - val_accuracy: 0.7135
Epoch 9/10
60/60 [==============================] - 450s 8s/step - loss: 0.5679 - accuracy: 0.7164 - val_loss: 0.5503 - val_accuracy: 0.7208
Epoch 10/10
60/60 [==============================] - 487s 8s/step - loss: 0.5656 - accuracy: 0.7125 - val_loss: 0.5520 - val_accuracy: 0.7240
# DataFrame.append and Styler.hide_index were removed in pandas 2.0;
# collect the rows in a list and build the frame in one go instead.
rows = []
loss_1, accuracy_1 = model.evaluate(X_test, y_test)
rows.append(["model", np.round(loss_1 * 100, 2), np.round(accuracy_1 * 100, 2)])
loss_2, accuracy_2 = model_2.evaluate(X_test, y_test)
rows.append(["model_2", np.round(loss_2 * 100, 2), np.round(accuracy_2 * 100, 2)])
loss_2a, accuracy_2a = model_2a.evaluate(X_test, y_test)
rows.append(["model_2a", np.round(loss_2a * 100, 2), np.round(accuracy_2a * 100, 2)])
loss_3, accuracy_3 = model_3.evaluate(X_test, y_test)
rows.append(["model_3", np.round(loss_3 * 100, 2), np.round(accuracy_3 * 100, 2)])
RESULTS = pd.DataFrame(rows, columns=['MODEL_NAME', 'LOSS', 'ACCURACY'])
RESULTS.style.set_properties(**{'border': '1.3px solid black', 'color': 'blue'}).hide(axis="index")
38/38 [==============================] - 7s 163ms/step - loss: 0.5908 - accuracy: 0.7142
38/38 [==============================] - 6s 159ms/step - loss: 0.5694 - accuracy: 0.7050
38/38 [==============================] - 25s 656ms/step - loss: 0.5557 - accuracy: 0.7183
38/38 [==============================] - 24s 625ms/step - loss: 0.5746 - accuracy: 0.6983
| MODEL_NAME | LOSS | ACCURACY |
|---|---|---|
| model | 59.080000 | 71.420000 |
| model_2 | 56.940000 | 70.500000 |
| model_2a | 55.570000 | 71.830000 |
| model_3 | 57.460000 | 69.830000 |
RESULTS.to_csv("Optimizer_and_loss_finetuning_1.csv",index = False)
Observations:
* The models train very slowly because the learning rate is very low.
* There is no large bump in model accuracy compared to the base models.
* Next, we reduce the number of images per batch from 64 to 16.
* A DICOM image contains a lot of data, and feeding many images per batch can leave out some meaningful information.
* By reducing the number of images per batch, we expect the model to train better on the images fed and thus improve performance.
optimizer = Adam(learning_rate=0.01)
model.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 109, 109, 32) 4736
max_pooling2d (MaxPooling2D (None, 54, 54, 32) 0
)
batch_normalization (BatchN (None, 54, 54, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 50, 50, 64) 51264
batch_normalization_1 (Batc (None, 50, 50, 64) 256
hNormalization)
max_pooling2d_1 (MaxPooling (None, 25, 25, 64) 0
2D)
flatten (Flatten) (None, 40000) 0
dense (Dense) (None, 128) 5120128
dense_1 (Dense) (None, 64) 8256
dense_2 (Dense) (None, 2) 130
=================================================================
Total params: 5,184,898
Trainable params: 5,184,706
Non-trainable params: 192
_________________________________________________________________
batch_size = 16
nb_epochs = 10
history = model.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
240/240 [==============================] - 126s 523ms/step - loss: 2.3675 - accuracy: 0.6521 - val_loss: 0.7864 - val_accuracy: 0.6979
Epoch 2/10
240/240 [==============================] - 124s 518ms/step - loss: 0.6270 - accuracy: 0.7078 - val_loss: 0.5919 - val_accuracy: 0.7208
Epoch 3/10
240/240 [==============================] - 124s 515ms/step - loss: 0.5769 - accuracy: 0.7193 - val_loss: 0.6178 - val_accuracy: 0.6854
Epoch 4/10
240/240 [==============================] - 122s 507ms/step - loss: 0.5692 - accuracy: 0.7224 - val_loss: 0.5901 - val_accuracy: 0.7250
Epoch 5/10
240/240 [==============================] - 122s 508ms/step - loss: 0.5425 - accuracy: 0.7297 - val_loss: 0.6170 - val_accuracy: 0.7000
Epoch 6/10
240/240 [==============================] - 123s 514ms/step - loss: 0.5504 - accuracy: 0.7211 - val_loss: 0.6445 - val_accuracy: 0.7073
optimizer = Adam(learning_rate=0.01)
model_2.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_2.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 109, 109, 32) 4736
max_pooling2d (MaxPooling2D (None, 54, 54, 32) 0
)
batch_normalization (BatchN (None, 54, 54, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 50, 50, 64) 51264
max_pooling2d_1 (MaxPooling (None, 25, 25, 64) 0
2D)
batch_normalization_1 (Batc (None, 25, 25, 64) 256
hNormalization)
conv2d_2 (Conv2D) (None, 23, 23, 128) 73856
max_pooling2d_2 (MaxPooling (None, 11, 11, 128) 0
2D)
batch_normalization_2 (Batc (None, 11, 11, 128) 512
hNormalization)
flatten (Flatten) (None, 15488) 0
dense (Dense) (None, 128) 1982592
dense_1 (Dense) (None, 84) 10836
dense_2 (Dense) (None, 42) 3570
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 2,127,836
Trainable params: 2,127,388
Non-trainable params: 448
_________________________________________________________________
batch_size = 16
nb_epochs = 10
history = model_2.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
240/240 [==============================] - 120s 496ms/step - loss: 0.9150 - accuracy: 0.5143 - val_loss: 0.6935 - val_accuracy: 0.5000
Epoch 2/10
240/240 [==============================] - 116s 485ms/step - loss: 0.6941 - accuracy: 0.5042 - val_loss: 0.6933 - val_accuracy: 0.5000
Epoch 3/10
240/240 [==============================] - 115s 479ms/step - loss: 0.6934 - accuracy: 0.4930 - val_loss: 0.6931 - val_accuracy: 0.5000
optimizer = Adam(learning_rate=0.01)
model_2a.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_2a.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 220, 220, 32) 2432
max_pooling2d (MaxPooling2D (None, 110, 110, 32) 0
)
batch_normalization (BatchN (None, 110, 110, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 106, 106, 64) 51264
max_pooling2d_1 (MaxPooling (None, 53, 53, 64) 0
2D)
batch_normalization_1 (Batc (None, 53, 53, 64) 256
hNormalization)
conv2d_2 (Conv2D) (None, 51, 51, 128) 73856
max_pooling2d_2 (MaxPooling (None, 25, 25, 128) 0
2D)
batch_normalization_2 (Batc (None, 25, 25, 128) 512
hNormalization)
dropout (Dropout) (None, 25, 25, 128) 0
conv2d_3 (Conv2D) (None, 23, 23, 256) 295168
max_pooling2d_3 (MaxPooling (None, 11, 11, 256) 0
2D)
batch_normalization_3 (Batc (None, 11, 11, 256) 1024
hNormalization)
dropout_1 (Dropout) (None, 11, 11, 256) 0
flatten (Flatten) (None, 30976) 0
dense (Dense) (None, 256) 7930112
dropout_2 (Dropout) (None, 256) 0
dense_1 (Dense) (None, 128) 32896
dense_2 (Dense) (None, 64) 8256
dense_3 (Dense) (None, 32) 2080
dense_4 (Dense) (None, 2) 66
=================================================================
Total params: 8,398,050
Trainable params: 8,397,090
Non-trainable params: 960
_________________________________________________________________
batch_size = 16
nb_epochs = 10
history = model_2a.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
240/240 [==============================] - 443s 2s/step - loss: 0.9402 - accuracy: 0.6276 - val_loss: 0.5802 - val_accuracy: 0.6969
Epoch 2/10
240/240 [==============================] - 443s 2s/step - loss: 0.6269 - accuracy: 0.6826 - val_loss: 0.6670 - val_accuracy: 0.5906
Epoch 3/10
240/240 [==============================] - 441s 2s/step - loss: 0.6524 - accuracy: 0.6266 - val_loss: 0.6493 - val_accuracy: 0.5000
optimizer = Adam(learning_rate=0.01)
model_3.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_3.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 220, 220, 32) 2432
max_pooling2d (MaxPooling2D (None, 110, 110, 32) 0
)
conv2d_1 (Conv2D) (None, 106, 106, 64) 51264
max_pooling2d_1 (MaxPooling (None, 53, 53, 64) 0
2D)
conv2d_2 (Conv2D) (None, 51, 51, 128) 73856
max_pooling2d_2 (MaxPooling (None, 25, 25, 128) 0
2D)
batch_normalization (BatchN (None, 25, 25, 128) 512
ormalization)
conv2d_3 (Conv2D) (None, 23, 23, 256) 295168
max_pooling2d_3 (MaxPooling (None, 11, 11, 256) 0
2D)
dropout (Dropout) (None, 11, 11, 256) 0
flatten (Flatten) (None, 30976) 0
dense (Dense) (None, 128) 3965056
dense_1 (Dense) (None, 84) 10836
dense_2 (Dense) (None, 42) 3570
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 4,402,780
Trainable params: 4,402,524
Non-trainable params: 256
_________________________________________________________________
batch_size = 16
nb_epochs = 10
history = model_3.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
240/240 [==============================] - 417s 2s/step - loss: 0.9298 - accuracy: 0.5000 - val_loss: 0.6933 - val_accuracy: 0.5000
Epoch 2/10
240/240 [==============================] - 406s 2s/step - loss: 0.6936 - accuracy: 0.4844 - val_loss: 0.6932 - val_accuracy: 0.5000
Epoch 3/10
240/240 [==============================] - 405s 2s/step - loss: 0.6935 - accuracy: 0.4969 - val_loss: 0.6933 - val_accuracy: 0.5000
# DataFrame.append and Styler.hide_index were removed in pandas 2.0;
# collect the rows in a list and build the frame in one go instead.
rows = []
loss_1, accuracy_1 = model.evaluate(X_test, y_test)
rows.append(["model", np.round(loss_1 * 100, 2), np.round(accuracy_1 * 100, 2)])
loss_2, accuracy_2 = model_2.evaluate(X_test, y_test)
rows.append(["model_2", np.round(loss_2 * 100, 2), np.round(accuracy_2 * 100, 2)])
loss_2a, accuracy_2a = model_2a.evaluate(X_test, y_test)
rows.append(["model_2a", np.round(loss_2a * 100, 2), np.round(accuracy_2a * 100, 2)])
loss_3, accuracy_3 = model_3.evaluate(X_test, y_test)
rows.append(["model_3", np.round(loss_3 * 100, 2), np.round(accuracy_3 * 100, 2)])
RESULTS = pd.DataFrame(rows, columns=['MODEL_NAME', 'LOSS', 'ACCURACY'])
RESULTS.style.set_properties(**{'border': '1.3px solid black', 'color': 'blue'}).hide(axis="index")
38/38 [==============================] - 6s 150ms/step - loss: 0.6382 - accuracy: 0.6917
38/38 [==============================] - 6s 141ms/step - loss: 0.6931 - accuracy: 0.5000
38/38 [==============================] - 22s 572ms/step - loss: 0.6493 - accuracy: 0.5000
38/38 [==============================] - 21s 553ms/step - loss: 0.6933 - accuracy: 0.5000
| MODEL_NAME | LOSS | ACCURACY |
|---|---|---|
| model | 63.820000 | 69.170000 |
| model_2 | 69.310000 | 50.000000 |
| model_2a | 64.930000 | 50.000000 |
| model_3 | 69.330000 | 50.000000 |
RESULTS.to_csv("Reduce_Batch_Size.csv",index = False)
Observations:
* When the images per batch are reduced, accuracy takes a large step backwards and drops to 50%.
* This behaviour is expected: the model failed to learn the patterns with so few images per batch.
* Next, we will increase the number of images per batch from 64 to 120.
* Feeding more images per batch may let the model see more patterns within a single batch and train better.
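Changing the batch size also changes the number of gradient updates per epoch, which is visible in the progress bars of the training logs. A small sketch of that arithmetic for our 3,840 training images (the `steps_per_epoch` helper is just for illustration):

```python
import math

def steps_per_epoch(n_train, batch_size):
    # Keras runs ceil(n_train / batch_size) steps per epoch
    return math.ceil(n_train / batch_size)

for bs in (16, 64, 120):
    print(bs, steps_per_epoch(3840, bs))
# → 16 240
#   64 60
#   120 32
```

These match the 240/240, 60/60 and 32/32 progress bars seen in the logs for batch sizes 16, 64 and 120 respectively.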
batch_size = 120
nb_epochs = 10
history = model.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
32/32 [==============================] - 106s 3s/step - loss: 0.4911 - accuracy: 0.7555 - val_loss: 0.5871 - val_accuracy: 0.7240
Epoch 2/10
32/32 [==============================] - 103s 3s/step - loss: 0.4547 - accuracy: 0.7755 - val_loss: 0.6005 - val_accuracy: 0.7250
Epoch 3/10
32/32 [==============================] - 104s 3s/step - loss: 0.4403 - accuracy: 0.7823 - val_loss: 0.6144 - val_accuracy: 0.7208
batch_size = 120
nb_epochs = 10
history = model_2.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
32/32 [==============================] - 110s 3s/step - loss: 0.6930 - accuracy: 0.5000 - val_loss: 0.6931 - val_accuracy: 0.5000
Epoch 2/10
32/32 [==============================] - 110s 3s/step - loss: 0.6930 - accuracy: 0.5000 - val_loss: 0.6931 - val_accuracy: 0.5000
Epoch 3/10
32/32 [==============================] - 110s 3s/step - loss: 0.6930 - accuracy: 0.5000 - val_loss: 0.6932 - val_accuracy: 0.5000
batch_size = 120
nb_epochs = 10
history = model_2a.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
32/32 [==============================] - 445s 14s/step - loss: 0.6361 - accuracy: 0.6518 - val_loss: 0.6157 - val_accuracy: 0.7073
Epoch 2/10
32/32 [==============================] - 466s 15s/step - loss: 0.6189 - accuracy: 0.7026 - val_loss: 0.5929 - val_accuracy: 0.7167
Epoch 3/10
32/32 [==============================] - 463s 14s/step - loss: 0.5974 - accuracy: 0.7227 - val_loss: 0.5850 - val_accuracy: 0.7250
Epoch 4/10
32/32 [==============================] - 485s 15s/step - loss: 0.5821 - accuracy: 0.7279 - val_loss: 0.5795 - val_accuracy: 0.7302
Epoch 5/10
32/32 [==============================] - 11978s 20s/step - loss: 0.5732 - accuracy: 0.7279 - val_loss: 0.5754 - val_accuracy: 0.7240
Epoch 6/10
32/32 [==============================] - 583s 18s/step - loss: 0.5696 - accuracy: 0.7266 - val_loss: 0.5779 - val_accuracy: 0.7135
Epoch 7/10
32/32 [==============================] - 584s 18s/step - loss: 0.5621 - accuracy: 0.7388 - val_loss: 0.5658 - val_accuracy: 0.7302
Epoch 8/10
32/32 [==============================] - 520s 15s/step - loss: 0.5561 - accuracy: 0.7362 - val_loss: 0.5506 - val_accuracy: 0.7437
Epoch 9/10
32/32 [==============================] - 403s 13s/step - loss: 0.5498 - accuracy: 0.7375 - val_loss: 0.5499 - val_accuracy: 0.7333
Epoch 10/10
32/32 [==============================] - 403s 13s/step - loss: 0.5399 - accuracy: 0.7495 - val_loss: 0.5484 - val_accuracy: 0.7365
batch_size = 120
nb_epochs = 10
history = model_3.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
32/32 [==============================] - 387s 12s/step - loss: 0.6933 - accuracy: 0.5000 - val_loss: 0.6932 - val_accuracy: 0.5000
Epoch 2/10
32/32 [==============================] - 391s 12s/step - loss: 0.6932 - accuracy: 0.5000 - val_loss: 0.6932 - val_accuracy: 0.5000
Epoch 3/10
32/32 [==============================] - 393s 12s/step - loss: 0.6934 - accuracy: 0.4839 - val_loss: 0.6932 - val_accuracy: 0.5000
# DataFrame.append was removed in pandas 2.0; collect rows and build the frame once
rows = []
for name, m in [("model", model), ("model_2", model_2), ("model_2a", model_2a), ("model_3", model_3)]:
    loss, accuracy = m.evaluate(X_test, y_test)
    rows.append([name, np.round(loss * 100, 2), np.round(accuracy * 100, 2)])
RESULTS = pd.DataFrame(rows, columns=['MODEL_NAME', 'LOSS', 'ACCURACY'])
# Styler.hide_index() was deprecated in pandas 1.4 in favour of hide(axis="index")
RESULTS.style.set_properties(**{'border': '1.3px solid black', 'color': 'blue'}).hide(axis="index")
38/38 [==============================] - 35s 910ms/step - loss: 0.6109 - accuracy: 0.7150
38/38 [==============================] - 25s 658ms/step - loss: 0.6932 - accuracy: 0.5000
38/38 [==============================] - 40s 1s/step - loss: 0.5626 - accuracy: 0.7158
38/38 [==============================] - 34s 905ms/step - loss: 0.6932 - accuracy: 0.5000
| MODEL_NAME | LOSS | ACCURACY |
|---|---|---|
| model | 61.09 | 71.50 |
| model_2 | 69.32 | 50.00 |
| model_2a | 56.26 | 71.58 |
| model_3 | 69.32 | 50.00 |
RESULTS.to_csv("Increase_Batch_Size.csv",index = False)
* Up to this point, we have tuned only the data-related settings (dataset size, number and type of images) together with the optimizer and loss function.
* Next, we modify the layers of the base models to try to improve performance.
tf.keras.backend.clear_session()
# Initialize the model
model = Sequential()
# Add a Convolutional Layer with 32 filters of size 3x3, strides (1,1), 'same' padding, l1/l2 regularization and 'relu' activation
model.add(Conv2D(filters=32, kernel_size=3, strides=(1, 1), padding='same', kernel_regularizer = tf.keras.regularizers.l2(0.01), activity_regularizer = tf.keras.regularizers.l1(0.01), activation="relu", input_shape=(224, 224, 3)))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(BatchNormalization())
# model.add(Dropout(0.3))
# Add a Convolutional Layer with 64 filters of size 3x3, strides (1,1), 'same' padding, l1/l2 regularization and 'relu' activation
model.add(Conv2D(filters=64, kernel_size=3, strides=(1, 1), padding='same', kernel_regularizer = tf.keras.regularizers.l2(0.01), activity_regularizer = tf.keras.regularizers.l1(0.01), activation="relu"))
model.add(MaxPooling2D(pool_size=(2, 2)))
model.add(BatchNormalization())
model.add(Dropout(0.3))
# Flatten the layer
model.add(Flatten())
# Add Fully Connected Layer with 128 units and activation function as 'relu'
model.add(Dense(128, activation="relu"))
model.add(Dense(64, activation="relu"))
# Add Fully Connected Layer with 2 units and activation function as 'sigmoid'
model.add(Dense(2, activation="sigmoid"))
optimizer = Adam(learning_rate=0.00001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
model.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 224, 224, 32) 896
max_pooling2d (MaxPooling2D (None, 112, 112, 32) 0
)
batch_normalization (BatchN (None, 112, 112, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 112, 112, 64) 18496
max_pooling2d_1 (MaxPooling (None, 56, 56, 64) 0
2D)
batch_normalization_1 (Batc (None, 56, 56, 64) 256
hNormalization)
dropout (Dropout) (None, 56, 56, 64) 0
flatten (Flatten) (None, 200704) 0
dense (Dense) (None, 128) 25690240
dense_1 (Dense) (None, 64) 8256
dense_2 (Dense) (None, 2) 130
=================================================================
Total params: 25,718,402
Trainable params: 25,718,210
Non-trainable params: 192
_________________________________________________________________
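The output shapes in this and the following summaries obey standard convolution arithmetic: with 'same' padding the spatial size changes only by the stride, while 'valid' padding (used in model_2a below) also shrinks it by kernel_size - 1. A small helper (hypothetical, not part of the notebook) to sanity-check them:

```python
import math

def conv_out_size(in_size, kernel_size, stride=1, padding="valid"):
    """Spatial output size of a Conv2D/MaxPooling2D layer, per dimension."""
    if padding == "same":
        # zero-padded so only the stride reduces the size
        return math.ceil(in_size / stride)
    # 'valid': no zero-padding around the input
    return (in_size - kernel_size) // stride + 1

print(conv_out_size(224, 3, 1, "same"))  # 224 ('same' 3x3 conv, as in conv2d above)
print(conv_out_size(224, 2, 2))          # 112 (2x2 max pooling, stride 2)
print(conv_out_size(224, 5, 1))          # 220 ('valid' 5x5 conv, as in model_2a)
```

The same formula applied layer by layer reproduces every Output Shape column in the summaries.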
batch_size = 64
nb_epochs = 10
history = model.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 258s 4s/step - loss: 373833.3125 - accuracy: 0.6779 - val_loss: 367506.0000 - val_accuracy: 0.7052
Epoch 2/10
60/60 [==============================] - 248s 4s/step - loss: 359018.6562 - accuracy: 0.7563 - val_loss: 352139.3750 - val_accuracy: 0.7198
Epoch 3/10
60/60 [==============================] - 247s 4s/step - loss: 344620.0000 - accuracy: 0.8026 - val_loss: 337732.8438 - val_accuracy: 0.7312
Epoch 4/10
60/60 [==============================] - 284s 5s/step - loss: 330597.5625 - accuracy: 0.8349 - val_loss: 323821.8750 - val_accuracy: 0.7010
Epoch 5/10
60/60 [==============================] - 289s 5s/step - loss: 316935.0000 - accuracy: 0.8841 - val_loss: 310343.5938 - val_accuracy: 0.7208
Epoch 6/10
60/60 [==============================] - 284s 5s/step - loss: 303653.0000 - accuracy: 0.9201 - val_loss: 297248.7188 - val_accuracy: 0.7260
Epoch 7/10
60/60 [==============================] - 286s 5s/step - loss: 290755.3438 - accuracy: 0.9451 - val_loss: 284523.7812 - val_accuracy: 0.7385
Epoch 8/10
60/60 [==============================] - 286s 5s/step - loss: 278221.6250 - accuracy: 0.9677 - val_loss: 272167.8750 - val_accuracy: 0.7063
Epoch 9/10
60/60 [==============================] - 287s 5s/step - loss: 266036.9375 - accuracy: 0.9828 - val_loss: 260131.4688 - val_accuracy: 0.7094
Epoch 10/10
60/60 [==============================] - 288s 5s/step - loss: 254169.2812 - accuracy: 0.9909 - val_loss: 248407.6406 - val_accuracy: 0.7104
tf.keras.backend.clear_session()
# second model
model_2 = Sequential()
model_2.add(Conv2D(filters=32, kernel_size=7, strides=(2, 2), padding='same', kernel_regularizer = tf.keras.regularizers.l2(0.01), activity_regularizer = tf.keras.regularizers.l1(0.01), activation="relu", input_shape=(224, 224, 3)))
model_2.add(MaxPooling2D(pool_size=(2, 2)))
model_2.add(BatchNormalization())
model_2.add(Conv2D(filters=64, kernel_size=5, strides=(1, 1), padding='same', kernel_regularizer = tf.keras.regularizers.l2(0.01), activity_regularizer = tf.keras.regularizers.l1(0.01), activation="relu"))
model_2.add(MaxPooling2D(pool_size=(2, 2)))
model_2.add(BatchNormalization())
model_2.add(Dropout(rate=0.3))
model_2.add(Conv2D(filters=128, kernel_size=3, strides=(1, 1), padding='same', kernel_regularizer = tf.keras.regularizers.l2(0.01), activity_regularizer = tf.keras.regularizers.l1(0.01), activation="relu"))
model_2.add(MaxPooling2D(pool_size=(2, 2)))
model_2.add(BatchNormalization())
# model_2.add(Dropout(rate=0.7))
model_2.add(Flatten())
model_2.add(Dense(128, kernel_initializer='he_normal', activation="relu"))
# model_2.add(BatchNormalization())
# model_2.add(Dropout(rate=0.5))
model_2.add(Dense(84, kernel_initializer='he_normal', activation="relu"))
# model_2.add(BatchNormalization())
model_2.add(Dense(42, kernel_initializer='he_normal', activation="relu"))
model_2.add(Dense(2, activation="sigmoid"))
optimizer = Adam(learning_rate=0.0001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
model_2.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_2.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 112, 112, 32) 4736
max_pooling2d (MaxPooling2D (None, 56, 56, 32) 0
)
batch_normalization (BatchN (None, 56, 56, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 56, 56, 64) 51264
max_pooling2d_1 (MaxPooling (None, 28, 28, 64) 0
2D)
batch_normalization_1 (Batc (None, 28, 28, 64) 256
hNormalization)
dropout (Dropout) (None, 28, 28, 64) 0
conv2d_2 (Conv2D) (None, 28, 28, 128) 73856
max_pooling2d_2 (MaxPooling (None, 14, 14, 128) 0
2D)
batch_normalization_2 (Batc (None, 14, 14, 128) 512
hNormalization)
flatten (Flatten) (None, 25088) 0
dense (Dense) (None, 128) 3211392
dense_1 (Dense) (None, 84) 10836
dense_2 (Dense) (None, 42) 3570
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 3,356,636
Trainable params: 3,356,188
Non-trainable params: 448
_________________________________________________________________
batch_size = 64
nb_epochs = 10
history_2 = model_2.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 108s 2s/step - loss: 23017.2090 - accuracy: 0.6469 - val_loss: 2748.1780 - val_accuracy: 0.6313
Epoch 2/10
60/60 [==============================] - 108s 2s/step - loss: 1179.1445 - accuracy: 0.6654 - val_loss: 747.2348 - val_accuracy: 0.5844
Epoch 3/10
60/60 [==============================] - 126s 2s/step - loss: 612.2598 - accuracy: 0.6789 - val_loss: 622.0334 - val_accuracy: 0.5135
Epoch 4/10
60/60 [==============================] - 123s 2s/step - loss: 437.5329 - accuracy: 0.6924 - val_loss: 527.3038 - val_accuracy: 0.4990
Epoch 5/10
60/60 [==============================] - 123s 2s/step - loss: 313.8621 - accuracy: 0.6844 - val_loss: 431.1918 - val_accuracy: 0.5219
Epoch 6/10
60/60 [==============================] - 123s 2s/step - loss: 236.2792 - accuracy: 0.6909 - val_loss: 341.1299 - val_accuracy: 0.5000
Epoch 7/10
60/60 [==============================] - 123s 2s/step - loss: 189.0528 - accuracy: 0.6961 - val_loss: 259.9745 - val_accuracy: 0.5000
Epoch 8/10
60/60 [==============================] - 123s 2s/step - loss: 157.5050 - accuracy: 0.7008 - val_loss: 186.4092 - val_accuracy: 0.5000
Epoch 9/10
60/60 [==============================] - 123s 2s/step - loss: 135.0659 - accuracy: 0.7016 - val_loss: 132.1451 - val_accuracy: 0.5010
Epoch 10/10
60/60 [==============================] - 123s 2s/step - loss: 117.9370 - accuracy: 0.7104 - val_loss: 112.4869 - val_accuracy: 0.5094
tf.keras.backend.clear_session()
# variant of the second model (model_2a): 'valid' padding and a fourth conv block
model_2a = Sequential()
model_2a.add(Conv2D(filters=32, kernel_size=5, strides=(1, 1), kernel_regularizer = tf.keras.regularizers.l2(0.01), activity_regularizer = tf.keras.regularizers.l1(0.01), activation="relu", input_shape=(224, 224, 3)))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Conv2D(filters=64, kernel_size=5, strides=(1, 1), kernel_regularizer = tf.keras.regularizers.l2(0.01), activity_regularizer = tf.keras.regularizers.l1(0.01), activation="relu"))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Conv2D(filters=128, kernel_size=3, strides=(1, 1), kernel_regularizer = tf.keras.regularizers.l2(0.01), activity_regularizer = tf.keras.regularizers.l1(0.01), activation="relu"))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Dropout(rate=0.2))
model_2a.add(Conv2D(filters=256, kernel_size=3, strides=(1, 1), activation="relu"))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Dropout(rate=0.2))
model_2a.add(Flatten())
model_2a.add(Dense(256, kernel_initializer='he_normal', activation="relu"))
model_2a.add(Dropout(rate=0.25))
model_2a.add(Dense(128, kernel_initializer='he_normal', activation="relu"))
model_2a.add(Dense(64, kernel_initializer='he_normal', activation="relu"))
model_2a.add(Dense(32, kernel_initializer='he_normal', activation="relu"))
model_2a.add(Dense(2, activation="sigmoid"))
optimizer = Adam(learning_rate=0.00001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
model_2a.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_2a.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 220, 220, 32) 2432
max_pooling2d (MaxPooling2D (None, 110, 110, 32) 0
)
batch_normalization (BatchN (None, 110, 110, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 106, 106, 64) 51264
max_pooling2d_1 (MaxPooling (None, 53, 53, 64) 0
2D)
batch_normalization_1 (Batc (None, 53, 53, 64) 256
hNormalization)
conv2d_2 (Conv2D) (None, 51, 51, 128) 73856
max_pooling2d_2 (MaxPooling (None, 25, 25, 128) 0
2D)
batch_normalization_2 (Batc (None, 25, 25, 128) 512
hNormalization)
dropout (Dropout) (None, 25, 25, 128) 0
conv2d_3 (Conv2D) (None, 23, 23, 256) 295168
max_pooling2d_3 (MaxPooling (None, 11, 11, 256) 0
2D)
batch_normalization_3 (Batc (None, 11, 11, 256) 1024
hNormalization)
dropout_1 (Dropout) (None, 11, 11, 256) 0
flatten (Flatten) (None, 30976) 0
dense (Dense) (None, 256) 7930112
dropout_2 (Dropout) (None, 256) 0
dense_1 (Dense) (None, 128) 32896
dense_2 (Dense) (None, 64) 8256
dense_3 (Dense) (None, 32) 2080
dense_4 (Dense) (None, 2) 66
=================================================================
Total params: 8,398,050
Trainable params: 8,397,090
Non-trainable params: 960
_________________________________________________________________
batch_size = 64
nb_epochs = 10
history_2a = model_2a.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 365s 6s/step - loss: 313694.4062 - accuracy: 0.6214 - val_loss: 296763.9062 - val_accuracy: 0.5875
Epoch 2/10
60/60 [==============================] - 360s 6s/step - loss: 277786.9688 - accuracy: 0.6849 - val_loss: 260931.8281 - val_accuracy: 0.7052
Epoch 3/10
60/60 [==============================] - 351s 6s/step - loss: 244889.1719 - accuracy: 0.6930 - val_loss: 229189.9531 - val_accuracy: 0.7229
Epoch 4/10
60/60 [==============================] - 397s 7s/step - loss: 214784.2812 - accuracy: 0.7044 - val_loss: 200388.6719 - val_accuracy: 0.6802
Epoch 5/10
60/60 [==============================] - 395s 7s/step - loss: 187482.0625 - accuracy: 0.7083 - val_loss: 174608.1875 - val_accuracy: 0.6812
Epoch 6/10
60/60 [==============================] - 393s 7s/step - loss: 163200.0625 - accuracy: 0.7107 - val_loss: 151705.8906 - val_accuracy: 0.6927
Epoch 7/10
60/60 [==============================] - 394s 7s/step - loss: 141574.5781 - accuracy: 0.7141 - val_loss: 131268.6719 - val_accuracy: 0.7156
Epoch 8/10
60/60 [==============================] - 393s 7s/step - loss: 122353.3906 - accuracy: 0.7148 - val_loss: 113170.2734 - val_accuracy: 0.7052
Epoch 9/10
60/60 [==============================] - 394s 7s/step - loss: 105394.9766 - accuracy: 0.7240 - val_loss: 97349.0547 - val_accuracy: 0.7125
Epoch 10/10
60/60 [==============================] - 394s 7s/step - loss: 90617.5547 - accuracy: 0.7250 - val_loss: 83556.2266 - val_accuracy: 0.7073
tf.keras.backend.clear_session()
# model_3: He-initialized, 'same'-padded conv stack without l1/l2 regularizers
model_3 = Sequential()
model_3.add(Conv2D(filters=32, kernel_size=3, strides=(1, 1), padding='same', kernel_initializer='he_normal', activation="relu", input_shape=(224, 224, 3)))
model_3.add(MaxPooling2D(pool_size=(2, 2)))
model_3.add(BatchNormalization())
model_3.add(Conv2D(filters=64, kernel_size=3, strides=(1, 1), padding='same', kernel_initializer='he_normal', activation="relu"))
model_3.add(MaxPooling2D(pool_size=(2, 2)))
model_3.add(BatchNormalization())
model_3.add(Dropout(rate=0.35))
model_3.add(Conv2D(filters=128, kernel_size=3, strides=(1, 1), padding='same', kernel_initializer='he_normal', activation="relu"))
model_3.add(MaxPooling2D(pool_size=(2, 2)))
model_3.add(BatchNormalization())
model_3.add(Conv2D(filters=256, kernel_size=3, strides=(1, 1), padding='same', kernel_initializer='he_normal', activation="relu"))
model_3.add(MaxPooling2D(pool_size=(2, 2)))
model_3.add(BatchNormalization())
# model_3.add(Dropout(rate=0.35))
model_3.add(Flatten())
# model_3.add(Dense(624, kernel_initializer='he_normal', kernel_regularizer = tf.keras.regularizers.l2(0.01), activity_regularizer = tf.keras.regularizers.l1(0.01), activation="relu"))
model_3.add(Dense(824, kernel_initializer='he_normal', activation="relu"))
# model_3.add(Dense(512, kernel_initializer='he_normal', kernel_regularizer = tf.keras.regularizers.l2(0.01), activity_regularizer = tf.keras.regularizers.l1(0.01), activation="relu"))
# model_3.add(Dense(512, kernel_initializer='he_normal', activation="relu"))
# model_3.add(Dense(300, kernel_initializer='he_normal', activation="relu"))
model_3.add(Dense(84, kernel_initializer='he_normal', activation="relu"))
model_3.add(Dense(42, kernel_initializer='he_normal', activation="relu"))
model_3.add(Dropout(rate=0.35))
model_3.add(Dense(2, activation="sigmoid"))
optimizer = Adam(learning_rate=0.00001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
model_3.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_3.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 224, 224, 32) 896
max_pooling2d (MaxPooling2D (None, 112, 112, 32) 0
)
batch_normalization (BatchN (None, 112, 112, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 112, 112, 64) 18496
max_pooling2d_1 (MaxPooling (None, 56, 56, 64) 0
2D)
batch_normalization_1 (Batc (None, 56, 56, 64) 256
hNormalization)
dropout (Dropout) (None, 56, 56, 64) 0
conv2d_2 (Conv2D) (None, 56, 56, 128) 73856
max_pooling2d_2 (MaxPooling (None, 28, 28, 128) 0
2D)
batch_normalization_2 (Batc (None, 28, 28, 128) 512
hNormalization)
conv2d_3 (Conv2D) (None, 28, 28, 256) 295168
max_pooling2d_3 (MaxPooling (None, 14, 14, 256) 0
2D)
batch_normalization_3 (Batc (None, 14, 14, 256) 1024
hNormalization)
flatten (Flatten) (None, 50176) 0
dense (Dense) (None, 824) 41345848
dense_1 (Dense) (None, 84) 69300
dense_2 (Dense) (None, 42) 3570
dropout_1 (Dropout) (None, 42) 0
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 41,809,140
Trainable params: 41,808,180
Non-trainable params: 960
_________________________________________________________________
batch_size = 64
nb_epochs = 10
history_3 = model_3.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 284s 5s/step - loss: 0.6689 - accuracy: 0.6477 - val_loss: 0.6505 - val_accuracy: 0.6812
Epoch 2/10
60/60 [==============================] - 308s 5s/step - loss: 0.5731 - accuracy: 0.7156 - val_loss: 0.5848 - val_accuracy: 0.7177
Epoch 3/10
60/60 [==============================] - 308s 5s/step - loss: 0.5438 - accuracy: 0.7365 - val_loss: 0.5653 - val_accuracy: 0.7198
Epoch 4/10
60/60 [==============================] - 321s 5s/step - loss: 0.5135 - accuracy: 0.7565 - val_loss: 0.5697 - val_accuracy: 0.7219
Epoch 5/10
60/60 [==============================] - 388s 6s/step - loss: 0.4840 - accuracy: 0.7763 - val_loss: 0.5680 - val_accuracy: 0.7208
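model_3 stopped after epoch 5 even though nb_epochs = 10 because of the EarlyStopping callback: training halts once val_loss has failed to improve by at least min_delta for patience consecutive epochs. A plain-Python sketch of that rule (an illustration, not the actual Keras implementation):

```python
def early_stop_epoch(val_losses, patience=2, min_delta=0.001):
    """Return the 1-based epoch at which EarlyStopping would halt training."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(val_losses, start=1):
        if loss < best - min_delta:   # improved by at least min_delta
            best, wait = loss, 0
        else:
            wait += 1
            if wait >= patience:      # no improvement for `patience` epochs
                return epoch
    return len(val_losses)            # ran all epochs without stopping

# val_loss per epoch from the model_3 run above
print(early_stop_epoch([0.6505, 0.5848, 0.5653, 0.5697, 0.5680]))  # 5
```

The best val_loss (0.5653) occurs at epoch 3; epochs 4 and 5 fail to beat it by 0.001, exhausting the patience of 2, which matches the run stopping at epoch 5.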
# DataFrame.append was removed in pandas 2.0; collect rows and build the frame once
rows = []
for name, m in [("model", model), ("model_2", model_2), ("model_2a", model_2a), ("model_3", model_3)]:
    loss, accuracy = m.evaluate(X_test, y_test)
    rows.append([name, np.round(loss * 100, 2), np.round(accuracy * 100, 2)])
RESULTS = pd.DataFrame(rows, columns=['MODEL_NAME', 'LOSS', 'ACCURACY'])
# Styler.hide_index() was deprecated in pandas 1.4 in favour of hide(axis="index")
RESULTS.style.set_properties(**{'border': '1.3px solid black', 'color': 'blue'}).hide(axis="index")
38/38 [==============================] - 21s 531ms/step - loss: 248510.5312 - accuracy: 0.6867
38/38 [==============================] - 10s 257ms/step - loss: 107.0487 - accuracy: 0.5067
38/38 [==============================] - 30s 773ms/step - loss: 83549.0156 - accuracy: 0.6992
38/38 [==============================] - 23s 603ms/step - loss: 0.5760 - accuracy: 0.7067
| MODEL_NAME | LOSS | ACCURACY |
|---|---|---|
| model | 24851053.12 | 68.67 |
| model_2 | 10704.87 | 50.67 |
| model_2a | 8354901.56 | 69.92 |
| model_3 | 57.60 | 70.67 |
RESULTS.to_csv("Add_Update_Model_Layers.csv",index = False)
Observations:
* We added new layers and experimented with kernel/activity regularizers, kernel initializers, padding, and strides.
* Test accuracy did not improve, and the l1/l2-regularized models report very large losses because the regularization penalties dominate the cross-entropy term.
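A likely reason the losses above reach six figures is the activity_regularizer=l1(0.01): it adds 0.01 * sum(|activation|) over every unit of a layer's output to the loss, and a 224x224x32 feature map alone has about 1.6 million units. A rough numpy estimate (illustrative activations, not the real ones):

```python
import numpy as np

rng = np.random.default_rng(0)
# One 'same'-padded conv output from the first model: 224 x 224 x 32
fmap = rng.uniform(0.0, 1.0, size=(224, 224, 32))  # hypothetical ReLU activations

# l1 activity penalty added to the loss for this single layer
l1_penalty = 0.01 * np.abs(fmap).sum()
print(f"{fmap.size} units -> l1 penalty ~ {l1_penalty:.0f}")
```

Against a binary cross-entropy term of roughly 0.69, penalties of this magnitude (multiplied across several regularized layers) dominate the reported loss, which is why accuracy can improve while the loss stays in the hundreds of thousands.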
optimizer = RMSprop(learning_rate=0.00001, rho=0.9, momentum=0.9, epsilon=1e-07)
model.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 224, 224, 32) 896
max_pooling2d (MaxPooling2D (None, 112, 112, 32) 0
)
batch_normalization (BatchN (None, 112, 112, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 112, 112, 64) 18496
max_pooling2d_1 (MaxPooling (None, 56, 56, 64) 0
2D)
batch_normalization_1 (Batc (None, 56, 56, 64) 256
hNormalization)
dropout (Dropout) (None, 56, 56, 64) 0
flatten (Flatten) (None, 200704) 0
dense (Dense) (None, 128) 25690240
dense_1 (Dense) (None, 64) 8256
dense_2 (Dense) (None, 2) 130
=================================================================
Total params: 25,718,402
Trainable params: 25,718,210
Non-trainable params: 192
_________________________________________________________________
batch_size = 64
nb_epochs = 10
history = model.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 446s 7s/step - loss: 194680.3594 - accuracy: 0.8076 - val_loss: 143168.9531 - val_accuracy: 0.6875
Epoch 2/10
60/60 [==============================] - 394s 7s/step - loss: 108535.7344 - accuracy: 0.8375 - val_loss: 78112.7812 - val_accuracy: 0.5594
Epoch 3/10
60/60 [==============================] - 386s 6s/step - loss: 57126.1641 - accuracy: 0.8503 - val_loss: 39935.3047 - val_accuracy: 0.5979
Epoch 4/10
60/60 [==============================] - 376s 6s/step - loss: 29403.8457 - accuracy: 0.8768 - val_loss: 21068.7168 - val_accuracy: 0.6375
Epoch 5/10
60/60 [==============================] - 384s 6s/step - loss: 13762.1348 - accuracy: 0.8974 - val_loss: 9990.2607 - val_accuracy: 0.5000
Epoch 6/10
60/60 [==============================] - 382s 6s/step - loss: 4994.7144 - accuracy: 0.9122 - val_loss: 5183.1089 - val_accuracy: 0.5000
Epoch 7/10
60/60 [==============================] - 395s 7s/step - loss: 1362.1005 - accuracy: 0.8971 - val_loss: 3233.0425 - val_accuracy: 0.5000
Epoch 8/10
60/60 [==============================] - 397s 7s/step - loss: 206.5308 - accuracy: 0.8964 - val_loss: 2365.9502 - val_accuracy: 0.5000
Epoch 9/10
60/60 [==============================] - 397s 7s/step - loss: 64.5132 - accuracy: 0.8818 - val_loss: 1478.6406 - val_accuracy: 0.5000
Epoch 10/10
60/60 [==============================] - 396s 7s/step - loss: 32.3362 - accuracy: 0.8581 - val_loss: 280.6079 - val_accuracy: 0.5000
optimizer = RMSprop(learning_rate=0.00001, rho=0.9, momentum=0.9, epsilon=1e-07)
model_2.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_2.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 112, 112, 32) 4736
max_pooling2d (MaxPooling2D (None, 56, 56, 32) 0
)
batch_normalization (BatchN (None, 56, 56, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 56, 56, 64) 51264
max_pooling2d_1 (MaxPooling (None, 28, 28, 64) 0
2D)
batch_normalization_1 (Batc (None, 28, 28, 64) 256
hNormalization)
dropout (Dropout) (None, 28, 28, 64) 0
conv2d_2 (Conv2D) (None, 28, 28, 128) 73856
max_pooling2d_2 (MaxPooling (None, 14, 14, 128) 0
2D)
batch_normalization_2 (Batc (None, 14, 14, 128) 512
hNormalization)
flatten (Flatten) (None, 25088) 0
dense (Dense) (None, 128) 3211392
dense_1 (Dense) (None, 84) 10836
dense_2 (Dense) (None, 42) 3570
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 3,356,636
Trainable params: 3,356,188
Non-trainable params: 448
_________________________________________________________________
batch_size = 64
nb_epochs = 10
history_2 = model_2.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 162s 3s/step - loss: 45.5135 - accuracy: 0.6370 - val_loss: 17.6134 - val_accuracy: 0.5146
Epoch 2/10
60/60 [==============================] - 160s 3s/step - loss: 12.8880 - accuracy: 0.5891 - val_loss: 6.5359 - val_accuracy: 0.5010
Epoch 3/10
60/60 [==============================] - 160s 3s/step - loss: 6.4028 - accuracy: 0.5654 - val_loss: 3.6476 - val_accuracy: 0.4979
Epoch 4/10
60/60 [==============================] - 174s 3s/step - loss: 4.1662 - accuracy: 0.5719 - val_loss: 2.7279 - val_accuracy: 0.5042
Epoch 5/10
60/60 [==============================] - 177s 3s/step - loss: 3.1581 - accuracy: 0.5792 - val_loss: 2.3110 - val_accuracy: 0.5042
Epoch 6/10
60/60 [==============================] - 177s 3s/step - loss: 2.5679 - accuracy: 0.5659 - val_loss: 2.0630 - val_accuracy: 0.5042
Epoch 7/10
60/60 [==============================] - 177s 3s/step - loss: 2.1788 - accuracy: 0.5648 - val_loss: 1.8385 - val_accuracy: 0.5052
Epoch 8/10
60/60 [==============================] - 177s 3s/step - loss: 1.8797 - accuracy: 0.5539 - val_loss: 1.6489 - val_accuracy: 0.5052
Epoch 9/10
60/60 [==============================] - 177s 3s/step - loss: 1.6528 - accuracy: 0.5508 - val_loss: 1.4988 - val_accuracy: 0.5031
Epoch 10/10
60/60 [==============================] - 177s 3s/step - loss: 1.4750 - accuracy: 0.5549 - val_loss: 1.3631 - val_accuracy: 0.5042
optimizer = RMSprop(learning_rate=0.00001, rho=0.9, momentum=0.9, epsilon=1e-07)
model_2a.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_2a.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 220, 220, 32) 2432
max_pooling2d (MaxPooling2D (None, 110, 110, 32) 0
)
batch_normalization (BatchN (None, 110, 110, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 106, 106, 64) 51264
max_pooling2d_1 (MaxPooling (None, 53, 53, 64) 0
2D)
batch_normalization_1 (Batc (None, 53, 53, 64) 256
hNormalization)
conv2d_2 (Conv2D) (None, 51, 51, 128) 73856
max_pooling2d_2 (MaxPooling (None, 25, 25, 128) 0
2D)
batch_normalization_2 (Batc (None, 25, 25, 128) 512
hNormalization)
dropout (Dropout) (None, 25, 25, 128) 0
conv2d_3 (Conv2D) (None, 23, 23, 256) 295168
max_pooling2d_3 (MaxPooling (None, 11, 11, 256) 0
2D)
batch_normalization_3 (Batc (None, 11, 11, 256) 1024
hNormalization)
dropout_1 (Dropout) (None, 11, 11, 256) 0
flatten (Flatten) (None, 30976) 0
dense (Dense) (None, 256) 7930112
dropout_2 (Dropout) (None, 256) 0
dense_1 (Dense) (None, 128) 32896
dense_2 (Dense) (None, 64) 8256
dense_3 (Dense) (None, 32) 2080
dense_4 (Dense) (None, 2) 66
=================================================================
Total params: 8,398,050
Trainable params: 8,397,090
Non-trainable params: 960
_________________________________________________________________
batch_size = 64
nb_epochs = 10
history_2a = model_2a.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 542s 9s/step - loss: 37686.1016 - accuracy: 0.6768 - val_loss: 9170.2979 - val_accuracy: 0.5635
Epoch 2/10
60/60 [==============================] - 567s 9s/step - loss: 2617.6323 - accuracy: 0.6745 - val_loss: 5595.0859 - val_accuracy: 0.5000
Epoch 3/10
60/60 [==============================] - 564s 9s/step - loss: 416.0942 - accuracy: 0.6419 - val_loss: 7352.5571 - val_accuracy: 0.5000
Epoch 4/10
60/60 [==============================] - 529s 9s/step - loss: 140.3253 - accuracy: 0.5997 - val_loss: 7707.4409 - val_accuracy: 0.5000
optimizer = RMSprop(learning_rate=0.00001, rho=0.9, momentum=0.9, epsilon=1e-07)
model_3.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_3.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 224, 224, 32) 896
max_pooling2d (MaxPooling2D (None, 112, 112, 32) 0
)
batch_normalization (BatchN (None, 112, 112, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 112, 112, 64) 18496
max_pooling2d_1 (MaxPooling (None, 56, 56, 64) 0
2D)
batch_normalization_1 (Batc (None, 56, 56, 64) 256
hNormalization)
dropout (Dropout) (None, 56, 56, 64) 0
conv2d_2 (Conv2D) (None, 56, 56, 128) 73856
max_pooling2d_2 (MaxPooling (None, 28, 28, 128) 0
2D)
batch_normalization_2 (Batc (None, 28, 28, 128) 512
hNormalization)
conv2d_3 (Conv2D) (None, 28, 28, 256) 295168
max_pooling2d_3 (MaxPooling (None, 14, 14, 256) 0
2D)
batch_normalization_3 (Batc (None, 14, 14, 256) 1024
hNormalization)
flatten (Flatten) (None, 50176) 0
dense (Dense) (None, 824) 41345848
dense_1 (Dense) (None, 84) 69300
dense_2 (Dense) (None, 42) 3570
dropout_1 (Dropout) (None, 42) 0
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 41,809,140
Trainable params: 41,808,180
Non-trainable params: 960
_________________________________________________________________
batch_size = 64
nb_epochs = 10
history_3 = model_3.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
60/60 [==============================] - 456s 8s/step - loss: 0.6022 - accuracy: 0.7229 - val_loss: 0.5698 - val_accuracy: 0.7208
Epoch 2/10
60/60 [==============================] - 452s 8s/step - loss: 0.5490 - accuracy: 0.7518 - val_loss: 0.5737 - val_accuracy: 0.7208
Epoch 3/10
60/60 [==============================] - 453s 8s/step - loss: 0.4865 - accuracy: 0.7812 - val_loss: 0.5845 - val_accuracy: 0.7208
# DataFrame.append was removed in pandas 2.0; collect rows and build the frame once
rows = []
for name, m in [("model", model), ("model_2", model_2), ("model_2a", model_2a), ("model_3", model_3)]:
    loss, accuracy = m.evaluate(X_test, y_test)
    rows.append([name, np.round(loss * 100, 2), np.round(accuracy * 100, 2)])
RESULTS = pd.DataFrame(rows, columns=['MODEL_NAME', 'LOSS', 'ACCURACY'])
# Styler.hide_index() was deprecated in pandas 1.4 in favour of hide(axis="index")
RESULTS.style.set_properties(**{'border': '1.3px solid black', 'color': 'blue'}).hide(axis="index")
38/38 [==============================] - 19s 481ms/step - loss: 280.7041 - accuracy: 0.5000
38/38 [==============================] - 8s 213ms/step - loss: 1.3595 - accuracy: 0.5075
38/38 [==============================] - 28s 716ms/step - loss: 7709.5459 - accuracy: 0.5000
38/38 [==============================] - 21s 540ms/step - loss: 0.6297 - accuracy: 0.7083
| MODEL_NAME | LOSS | ACCURACY |
|---|---|---|
| model | 28070.410000 | 50.000000 |
| model_2 | 135.950000 | 50.750000 |
| model_2a | 770954.590000 | 50.000000 |
| model_3 | 62.970000 | 70.830000 |
RESULTS.to_csv("RMSProp_Optimizer.csv",index = False)
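The results table above is built with `DataFrame.append`, which was deprecated in pandas 1.4 and removed in 2.0 (likewise `Styler.hide_index`, replaced by `Styler.hide(axis="index")`). A minimal sketch of the same table built with `pd.concat`, using placeholder scores rather than the real `evaluate()` output:

```python
import pandas as pd

RESULTS = pd.DataFrame(columns=["MODEL_NAME", "LOSS", "ACCURACY"])
# Placeholder (name, loss, accuracy) pairs; in the notebook these come from model.evaluate()
for name, loss, acc in [("model", 2.8070, 0.50), ("model_3", 0.6297, 0.7083)]:
    row = pd.DataFrame([[name, round(loss * 100, 2), round(acc * 100, 2)]],
                       columns=RESULTS.columns)
    RESULTS = pd.concat([RESULTS, row], ignore_index=True)  # append one row at a time

print(RESULTS)
```

Collecting the rows in a plain list and calling `pd.concat` (or the `DataFrame` constructor) once at the end is cheaper than concatenating inside the loop, but the loop form mirrors the notebook's structure.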
Train_jpg, Test_jpg = train_test_split(NAMES_DF_FINAL, test_size=0.2, random_state=50,stratify=NAMES_DF_FINAL['Image_Target'])
Train_jpg1, Val_jpg = train_test_split(Train_jpg, test_size=0.2, random_state=50,stratify=Train_jpg['Image_Target'])
* Split the dataframe of 6000 JPEG images into train, test, and validation sets.
* Create 3 folders - Training / Validation / Testing Images - and segregate the 6000 images into them according to the train/test split.
* We have 3840 images in the Training folder, 1200 in the Testing folder, and 960 in the Validation folder.
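The folder counts follow from applying the 20% split twice; a quick sanity check of the arithmetic:

```python
total = 6000
test = int(total * 0.2)        # first split: 20% held out for testing
train_full = total - test      # 4800 remain
val = int(train_full * 0.2)    # second split: 20% of the remainder for validation
train = train_full - val

print(train, val, test)  # 3840 960 1200
```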
print('Shape of Training Images:',Train_jpg1.shape)
print("")
print('Shape of Testing Images:',Test_jpg.shape)
print("")
print('Shape of Validation Images:',Val_jpg.shape)
Shape of Training Images: (3840, 6)
Shape of Testing Images: (1200, 6)
Shape of Validation Images: (960, 6)
dst1 = "Training_Images"
dst2 = "Validation_Images"
dst3 = "Testing_Images"
mode = 0o777
os.mkdir(dst1, mode)
os.mkdir(dst2, mode)
os.mkdir(dst3, mode)
# Source path
source = "Training_jpg"
# Destination path
destination = "Training_Images"
for item in Train_jpg1['NAMES_JPG']:
file = item
src_file = os.path.join(source, file)
dst_file = os.path.join(destination, file)
#print("src_file",src_file)
#print("dst_file",dst_file)
shutil.copyfile(src_file, dst_file)
# Source path
source = "Training_jpg"
# Destination path
destination = "Validation_Images"
for item in Val_jpg['NAMES_JPG']:
file = item
src_file = os.path.join(source, file)
dst_file = os.path.join(destination, file)
#print("src_file",src_file)
#print("dst_file",dst_file)
shutil.copyfile(src_file, dst_file)
# Source path
source = "Training_jpg"
# Destination path
destination = "Testing_Images"
for item in Test_jpg['NAMES_JPG']:
file = item
src_file = os.path.join(source, file)
dst_file = os.path.join(destination, file)
#print("src_file",src_file)
#print("dst_file",dst_file)
shutil.copyfile(src_file, dst_file)
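The three copy loops differ only in their source dataframe and destination folder, so they can be collapsed into one helper. A hedged refactor (directory names follow the notebook; `shutil.copyfile` silently overwrites existing files), which also creates the destination folder if it is missing:

```python
import os
import shutil

def copy_split(df, filename_col, source, destination):
    """Copy every file listed in df[filename_col] from source/ into destination/."""
    os.makedirs(destination, exist_ok=True)  # no error if the folder already exists
    for name in df[filename_col]:
        shutil.copyfile(os.path.join(source, name),
                        os.path.join(destination, name))

# Usage mirroring the notebook's three loops:
# copy_split(Train_jpg1, "NAMES_JPG", "Training_jpg", "Training_Images")
# copy_split(Val_jpg,    "NAMES_JPG", "Training_jpg", "Validation_Images")
# copy_split(Test_jpg,   "NAMES_JPG", "Training_jpg", "Testing_Images")
```

`os.makedirs(..., exist_ok=True)` also makes the separate `os.mkdir` cell unnecessary, and unlike `os.mkdir` it won't raise if the notebook is re-run.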
Train_images_count = os.listdir("Training_Images") # your directory path
Training_number_files = len(Train_images_count)
print('Number of images in Training Folder:',Training_number_files)
Validation_images_count = os.listdir("Validation_Images") # your directory path
Validation_number_files = len(Validation_images_count)
print('\nNumber of images in Validation Folder:',Validation_number_files)
Test_images_count = os.listdir("Testing_Images") # your directory path
Test_number_files = len(Test_images_count)
print('\nNumber of images in Test Folder:',Test_number_files)
Number of images in Training Folder: 3840
Number of images in Validation Folder: 960
Number of images in Test Folder: 1200
Train_Target_category = tf.keras.utils.to_categorical(Train_jpg1["Image_Target"], num_classes=2)
Val_Target_category = tf.keras.utils.to_categorical(Val_jpg["Image_Target"], num_classes=2)
Test_Target_category = tf.keras.utils.to_categorical(Test_jpg["Image_Target"], num_classes=2)
Train_TC = np.asarray(Train_jpg1["Image_Target"]).astype('float32').reshape((-1,1))
Val_TC = np.asarray(Val_jpg["Image_Target"]).astype('float32').reshape((-1,1))
Test_TC = np.asarray(Test_jpg["Image_Target"]).astype('float32').reshape((-1,1))
Train_jpg1["Target_Cat"] = Train_jpg1['Image_Target'].apply(lambda x: "No" if x == 0 else "Yes")
Val_jpg["Target_Cat"] = Val_jpg['Image_Target'].apply(lambda x: "No" if x == 0 else "Yes")
Test_jpg["Target_Cat"] = Test_jpg['Image_Target'].apply(lambda x: "No" if x == 0 else "Yes")
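`tf.keras.utils.to_categorical` simply one-hot encodes the integer targets. An illustrative pure-NumPy equivalent for the two-class case (not the Keras implementation itself):

```python
import numpy as np

def one_hot(labels, num_classes=2):
    """One-hot encode integer class labels, like tf.keras.utils.to_categorical."""
    return np.eye(num_classes)[np.asarray(labels, dtype=int)]

print(one_hot([0, 1, 1]))
# [[1. 0.]
#  [0. 1.]
#  [0. 1.]]
```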
Train_jpg1.sample(2)
| | NAMES_JPG | NAMES_DCM | Image_Class | Image_Class_Category | Image_Target | Full_filename | Target_Cat |
|---|---|---|---|---|---|---|---|
| 3272 | da0d705d-7722-43d1-a880-a34c553f0719.jpg | da0d705d-7722-43d1-a880-a34c553f0719.dcm | Lung Opacity | 0 | 1 | Training_jpg/da0d705d-7722-43d1-a880-a34c553f0719.jpg | Yes |
| 5152 | 00704310-78a8-4b38-8475-49f4573b2dbb.jpg | 00704310-78a8-4b38-8475-49f4573b2dbb.dcm | Lung Opacity | 0 | 1 | Training_jpg/00704310-78a8-4b38-8475-49f4573b2dbb.jpg | Yes |
Train_jpg1.sample(3)
| | NAMES_JPG | NAMES_DCM | Image_Class | Image_Class_Category | Image_Target | Full_filename | Target_Cat |
|---|---|---|---|---|---|---|---|
| 23 | 6326e2e5-d349-42e2-b42e-8dd63bc40d19.jpg | 6326e2e5-d349-42e2-b42e-8dd63bc40d19.dcm | Not Normal | 2 | 0 | Training_jpg/6326e2e5-d349-42e2-b42e-8dd63bc40d19.jpg | No |
| 4737 | be9d0ed7-00ce-44b5-a83c-8a9796be8ea6.jpg | be9d0ed7-00ce-44b5-a83c-8a9796be8ea6.dcm | Lung Opacity | 0 | 1 | Training_jpg/be9d0ed7-00ce-44b5-a83c-8a9796be8ea6.jpg | Yes |
| 1327 | 657b1ec2-8f10-4070-bfde-db020bbdb2fe.jpg | 657b1ec2-8f10-4070-bfde-db020bbdb2fe.dcm | Normal | 1 | 0 | Training_jpg/657b1ec2-8f10-4070-bfde-db020bbdb2fe.jpg | No |
train_datagen = ImageDataGenerator(preprocessing_function = preprocess_input,
rotation_range = 20,
width_shift_range = 0.2,
height_shift_range = 0.2,
shear_range=0.2,
zoom_range=0.2)
'''
train_datagen = ImageDataGenerator(preprocessing_function = preprocess_input,
rotation_range = 20,
width_shift_range = 0.2,
height_shift_range = 0.2,
shear_range = 0.2,
zoom_range = 0.2,
horizontal_flip = True,
fill_mode = 'nearest')
'''
train_generator = train_datagen.flow_from_dataframe(
dataframe = Train_jpg1,
directory = "Training_Images",
x_col = "NAMES_JPG",
y_col = "Target_Cat",
class_mode = "categorical",
target_size = (224,224),
batch_size = 32, # batch_size,
shuffle = True)
Found 3840 validated image filenames belonging to 2 classes.
valid_datagen = ImageDataGenerator(preprocessing_function = preprocess_input)
valid_generator = valid_datagen.flow_from_dataframe(
dataframe=Val_jpg,
directory="Validation_Images",
x_col = "NAMES_JPG",
y_col="Target_Cat",
class_mode="categorical",
target_size=(224,224),
batch_size=32, # batch_size,
shuffle = True)
Found 960 validated image filenames belonging to 2 classes.
Test_datagen = ImageDataGenerator(preprocessing_function = preprocess_input)
Test_generator = Test_datagen.flow_from_dataframe(
dataframe=Test_jpg,
directory="Testing_Images",
x_col = "NAMES_JPG",
y_col="Target_Cat",
class_mode="categorical",
target_size=(224,224),
batch_size=32, # batch_size,
shuffle = True)
Found 1200 validated image filenames belonging to 2 classes.
def round_up(n, decimals=0):
multiplier = 10 ** decimals
return math.ceil(n * multiplier) / multiplier
STEP_SIZE_TRAIN=round_up(train_generator.n/train_generator.batch_size)
STEP_SIZE_VALID=round_up(valid_generator.n/valid_generator.batch_size)
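Since `round_up` is only ever called with `decimals=0`, the step sizes reduce to `math.ceil` of samples over batch size; for the generators above (3840/32 train, 960/32 validation, 1200/32 test) that yields the 120, 30, and 38 steps visible in the logs:

```python
import math

def steps(n_samples, batch_size):
    # Number of batches needed to cover every sample once per epoch
    return math.ceil(n_samples / batch_size)

print(steps(3840, 32), steps(960, 32), steps(1200, 32))  # 120 30 38
```

Note that `math.ceil` returns an int, whereas `round_up` returns a float because of the final division.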
optimizer = Adam(learning_rate=0.00001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
model.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 224, 224, 32) 896
max_pooling2d (MaxPooling2D (None, 112, 112, 32) 0
)
batch_normalization (BatchN (None, 112, 112, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 112, 112, 64) 18496
max_pooling2d_1 (MaxPooling (None, 56, 56, 64) 0
2D)
batch_normalization_1 (Batc (None, 56, 56, 64) 256
hNormalization)
dropout (Dropout) (None, 56, 56, 64) 0
flatten (Flatten) (None, 200704) 0
dense (Dense) (None, 128) 25690240
dense_1 (Dense) (None, 64) 8256
dense_2 (Dense) (None, 2) 130
=================================================================
Total params: 25,718,402
Trainable params: 25,718,210
Non-trainable params: 192
_________________________________________________________________
epochs = 10
# fit_generator is deprecated/removed in recent TF; Model.fit accepts generators directly
history = model.fit(
    train_generator,
    steps_per_epoch=STEP_SIZE_TRAIN,
    validation_data=valid_generator,
    validation_steps=STEP_SIZE_VALID,
    epochs=epochs)
Epoch 1/10 120/120 [==============================] - 376s 3s/step - loss: 430164.6250 - accuracy: 0.5417 - val_loss: 385582.2812 - val_accuracy: 0.6365
Epoch 2/10 120/120 [==============================] - 343s 3s/step - loss: 416412.6875 - accuracy: 0.6021 - val_loss: 376150.7188 - val_accuracy: 0.6521
Epoch 3/10 120/120 [==============================] - 342s 3s/step - loss: 402261.3438 - accuracy: 0.5992 - val_loss: 365302.6250 - val_accuracy: 0.6187
Epoch 4/10 120/120 [==============================] - 341s 3s/step - loss: 392648.4688 - accuracy: 0.6021 - val_loss: 354030.9062 - val_accuracy: 0.6250
Epoch 5/10 120/120 [==============================] - 341s 3s/step - loss: 375942.5625 - accuracy: 0.6128 - val_loss: 342611.8125 - val_accuracy: 0.6323
Epoch 6/10 120/120 [==============================] - 342s 3s/step - loss: 365785.9688 - accuracy: 0.6096 - val_loss: 331324.6562 - val_accuracy: 0.6240
Epoch 7/10 120/120 [==============================] - 341s 3s/step - loss: 354067.9375 - accuracy: 0.6154 - val_loss: 319875.2812 - val_accuracy: 0.5958
Epoch 8/10 120/120 [==============================] - 341s 3s/step - loss: 340272.1250 - accuracy: 0.6359 - val_loss: 308486.1562 - val_accuracy: 0.6354
Epoch 9/10 120/120 [==============================] - 341s 3s/step - loss: 327947.3125 - accuracy: 0.6258 - val_loss: 297119.1250 - val_accuracy: 0.6042
Epoch 10/10 120/120 [==============================] - 340s 3s/step - loss: 320065.5625 - accuracy: 0.6339 - val_loss: 285674.8438 - val_accuracy: 0.6417
optimizer = Adam(learning_rate=0.00001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
model_2.compile(optimizer = optimizer , loss = "categorical_crossentropy", metrics=["accuracy"])
model_2.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 112, 112, 32) 4736
max_pooling2d (MaxPooling2D (None, 56, 56, 32) 0
)
batch_normalization (BatchN (None, 56, 56, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 56, 56, 64) 51264
max_pooling2d_1 (MaxPooling (None, 28, 28, 64) 0
2D)
batch_normalization_1 (Batc (None, 28, 28, 64) 256
hNormalization)
dropout (Dropout) (None, 28, 28, 64) 0
conv2d_2 (Conv2D) (None, 28, 28, 128) 73856
max_pooling2d_2 (MaxPooling (None, 14, 14, 128) 0
2D)
batch_normalization_2 (Batc (None, 14, 14, 128) 512
hNormalization)
flatten (Flatten) (None, 25088) 0
dense (Dense) (None, 128) 3211392
dense_1 (Dense) (None, 84) 10836
dense_2 (Dense) (None, 42) 3570
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 3,356,636
Trainable params: 3,356,188
Non-trainable params: 448
_________________________________________________________________
epochs = 10
history_2 = model_2.fit(
    train_generator,
    steps_per_epoch=STEP_SIZE_TRAIN,
    validation_data=valid_generator,
    validation_steps=STEP_SIZE_VALID,
    epochs=epochs)
Epoch 1/10 120/120 [==============================] - 263s 2s/step - loss: 216717.4844 - accuracy: 0.5388 - val_loss: 190052.9375 - val_accuracy: 0.5979
Epoch 2/10 120/120 [==============================] - 262s 2s/step - loss: 202214.7188 - accuracy: 0.5151 - val_loss: 175923.6406 - val_accuracy: 0.5000
Epoch 3/10 120/120 [==============================] - 261s 2s/step - loss: 185344.5156 - accuracy: 0.5229 - val_loss: 162092.9688 - val_accuracy: 0.5094
Epoch 4/10 120/120 [==============================] - 268s 2s/step - loss: 171409.0625 - accuracy: 0.5073 - val_loss: 148565.8438 - val_accuracy: 0.5010
Epoch 5/10 120/120 [==============================] - 268s 2s/step - loss: 155862.4219 - accuracy: 0.5023 - val_loss: 135087.5625 - val_accuracy: 0.5000
Epoch 6/10 120/120 [==============================] - 264s 2s/step - loss: 142000.7031 - accuracy: 0.5078 - val_loss: 121572.3828 - val_accuracy: 0.5000
Epoch 7/10 120/120 [==============================] - 256s 2s/step - loss: 125898.8359 - accuracy: 0.5057 - val_loss: 108194.5078 - val_accuracy: 0.5000
Epoch 8/10 120/120 [==============================] - 256s 2s/step - loss: 111337.4141 - accuracy: 0.5063 - val_loss: 94847.7188 - val_accuracy: 0.5000
Epoch 9/10 120/120 [==============================] - 257s 2s/step - loss: 96196.2812 - accuracy: 0.5023 - val_loss: 81598.0859 - val_accuracy: 0.5000
Epoch 10/10 120/120 [==============================] - 255s 2s/step - loss: 81925.5234 - accuracy: 0.5003 - val_loss: 68515.7734 - val_accuracy: 0.5000
optimizer = Adam(learning_rate=0.00001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
model_2a.compile(optimizer = optimizer , loss = "categorical_crossentropy", metrics=["accuracy"])
model_2a.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 220, 220, 32) 2432
max_pooling2d (MaxPooling2D (None, 110, 110, 32) 0
)
batch_normalization (BatchN (None, 110, 110, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 106, 106, 64) 51264
max_pooling2d_1 (MaxPooling (None, 53, 53, 64) 0
2D)
batch_normalization_1 (Batc (None, 53, 53, 64) 256
hNormalization)
conv2d_2 (Conv2D) (None, 51, 51, 128) 73856
max_pooling2d_2 (MaxPooling (None, 25, 25, 128) 0
2D)
batch_normalization_2 (Batc (None, 25, 25, 128) 512
hNormalization)
dropout (Dropout) (None, 25, 25, 128) 0
conv2d_3 (Conv2D) (None, 23, 23, 256) 295168
max_pooling2d_3 (MaxPooling (None, 11, 11, 256) 0
2D)
batch_normalization_3 (Batc (None, 11, 11, 256) 1024
hNormalization)
dropout_1 (Dropout) (None, 11, 11, 256) 0
flatten (Flatten) (None, 30976) 0
dense (Dense) (None, 256) 7930112
dropout_2 (Dropout) (None, 256) 0
dense_1 (Dense) (None, 128) 32896
dense_2 (Dense) (None, 64) 8256
dense_3 (Dense) (None, 32) 2080
dense_4 (Dense) (None, 2) 66
=================================================================
Total params: 8,398,050
Trainable params: 8,397,090
Non-trainable params: 960
_________________________________________________________________
epochs = 10
history_2a = model_2a.fit(
    train_generator,
    steps_per_epoch=STEP_SIZE_TRAIN,
    validation_data=valid_generator,
    validation_steps=STEP_SIZE_VALID,
    epochs=epochs)
Epoch 1/10 120/120 [==============================] - 749s 6s/step - loss: 436355.0000 - accuracy: 0.5714 - val_loss: 372988.6875 - val_accuracy: 0.5375
Epoch 2/10 120/120 [==============================] - 720s 6s/step - loss: 396310.9688 - accuracy: 0.5883 - val_loss: 344970.6250 - val_accuracy: 0.5000
Epoch 3/10 120/120 [==============================] - 3376s 28s/step - loss: 364014.9688 - accuracy: 0.5401 - val_loss: 316454.5625 - val_accuracy: 0.5000
Epoch 4/10 120/120 [==============================] - 714s 6s/step - loss: 339658.0000 - accuracy: 0.5451 - val_loss: 289452.6875 - val_accuracy: 0.4990
Epoch 5/10 120/120 [==============================] - 720s 6s/step - loss: 305067.5000 - accuracy: 0.5596 - val_loss: 263146.4688 - val_accuracy: 0.5625
Epoch 6/10 120/120 [==============================] - 5765s 48s/step - loss: 277047.8438 - accuracy: 0.5688 - val_loss: 237048.2656 - val_accuracy: 0.5146
Epoch 7/10 120/120 [==============================] - 588s 5s/step - loss: 249290.9531 - accuracy: 0.5802 - val_loss: 211155.3906 - val_accuracy: 0.5354
Epoch 8/10 120/120 [==============================] - 570s 5s/step - loss: 218804.9062 - accuracy: 0.5734 - val_loss: 186023.1562 - val_accuracy: 0.5740
Epoch 9/10 120/120 [==============================] - 576s 5s/step - loss: 192098.8281 - accuracy: 0.5698 - val_loss: 162182.0938 - val_accuracy: 0.6187
Epoch 10/10 120/120 [==============================] - 572s 5s/step - loss: 163746.9219 - accuracy: 0.5786 - val_loss: 140253.5312 - val_accuracy: 0.6333
optimizer = Adam(learning_rate=0.00001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
model_3.compile(optimizer = optimizer , loss = "categorical_crossentropy", metrics=["accuracy"])
model_3.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 224, 224, 32) 896
max_pooling2d (MaxPooling2D (None, 112, 112, 32) 0
)
batch_normalization (BatchN (None, 112, 112, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 112, 112, 64) 18496
max_pooling2d_1 (MaxPooling (None, 56, 56, 64) 0
2D)
batch_normalization_1 (Batc (None, 56, 56, 64) 256
hNormalization)
dropout (Dropout) (None, 56, 56, 64) 0
conv2d_2 (Conv2D) (None, 56, 56, 128) 73856
max_pooling2d_2 (MaxPooling (None, 28, 28, 128) 0
2D)
batch_normalization_2 (Batc (None, 28, 28, 128) 512
hNormalization)
conv2d_3 (Conv2D) (None, 28, 28, 256) 295168
max_pooling2d_3 (MaxPooling (None, 14, 14, 256) 0
2D)
batch_normalization_3 (Batc (None, 14, 14, 256) 1024
hNormalization)
flatten (Flatten) (None, 50176) 0
dense (Dense) (None, 824) 41345848
dense_1 (Dense) (None, 84) 69300
dense_2 (Dense) (None, 42) 3570
dropout_1 (Dropout) (None, 42) 0
dense_3 (Dense) (None, 2) 86
=================================================================
Total params: 41,809,140
Trainable params: 41,808,180
Non-trainable params: 960
_________________________________________________________________
epochs = 10
history_3 = model_3.fit(
    train_generator,
    steps_per_epoch=STEP_SIZE_TRAIN,
    validation_data=valid_generator,
    validation_steps=STEP_SIZE_VALID,
    epochs=epochs)
Epoch 1/10 120/120 [==============================] - 464s 4s/step - loss: 0.8961 - accuracy: 0.5992 - val_loss: 0.6608 - val_accuracy: 0.5865
Epoch 2/10 120/120 [==============================] - 417s 3s/step - loss: 0.6789 - accuracy: 0.6359 - val_loss: 0.6316 - val_accuracy: 0.6646
Epoch 3/10 120/120 [==============================] - 441s 4s/step - loss: 0.6567 - accuracy: 0.6500 - val_loss: 0.6065 - val_accuracy: 0.6687
Epoch 4/10 120/120 [==============================] - 474s 4s/step - loss: 0.6378 - accuracy: 0.6703 - val_loss: 0.5803 - val_accuracy: 0.7063
Epoch 5/10 120/120 [==============================] - 591s 5s/step - loss: 0.6161 - accuracy: 0.6753 - val_loss: 0.5961 - val_accuracy: 0.6823
Epoch 6/10 120/120 [==============================] - 582s 5s/step - loss: 0.6287 - accuracy: 0.6661 - val_loss: 0.5706 - val_accuracy: 0.7125
Epoch 7/10 120/120 [==============================] - 577s 5s/step - loss: 0.6133 - accuracy: 0.6737 - val_loss: 0.5638 - val_accuracy: 0.7135
Epoch 8/10 120/120 [==============================] - 576s 5s/step - loss: 0.6160 - accuracy: 0.6763 - val_loss: 0.5827 - val_accuracy: 0.6833
Epoch 9/10 120/120 [==============================] - 423s 4s/step - loss: 0.6039 - accuracy: 0.6852 - val_loss: 0.5774 - val_accuracy: 0.6896
Epoch 10/10 120/120 [==============================] - 412s 3s/step - loss: 0.5992 - accuracy: 0.6909 - val_loss: 0.5844 - val_accuracy: 0.6927
RESULTS = pd.DataFrame()
loss_1, accuracy_1 = model.evaluate(Test_generator)
RESULTS = RESULTS.append(pd.Series(["model",np.round((loss_1*100),2),np.round((accuracy_1*100),2)]),ignore_index=True)
loss_2, accuracy_2 = model_2.evaluate(Test_generator)
RESULTS = RESULTS.append(pd.Series(["model_2",np.round((loss_2*100),2),np.round((accuracy_2*100),2)]),ignore_index=True)
loss_2a, accuracy_2a = model_2a.evaluate(Test_generator)
RESULTS = RESULTS.append(pd.Series(["model_2a",np.round((loss_2a*100),2),np.round((accuracy_2a*100),2)]),ignore_index=True)
loss_3, accuracy_3 = model_3.evaluate(Test_generator)
RESULTS = RESULTS.append(pd.Series(["model_3",np.round((loss_3*100),2),np.round((accuracy_3*100),2)]),ignore_index=True)
RESULTS.columns = ['MODEL_NAME','LOSS','ACCURACY']
RESULTS.style.set_properties(**{'border': '1.3px solid black','color': 'blue'}).hide_index()
38/38 [==============================] - 17s 434ms/step - loss: 295867.4688 - accuracy: 0.6333
38/38 [==============================] - 10s 273ms/step - loss: 14534.3945 - accuracy: 0.5000
38/38 [==============================] - 25s 639ms/step - loss: 145273.4219 - accuracy: 0.6142
38/38 [==============================] - 19s 486ms/step - loss: 0.6062 - accuracy: 0.6867
| MODEL_NAME | LOSS | ACCURACY |
|---|---|---|
| model | 29586746.880000 | 63.330000 |
| model_2 | 1453439.450000 | 50.000000 |
| model_2a | 14527342.190000 | 61.420000 |
| model_3 | 60.620000 | 68.670000 |
RESULTS.to_csv("Using_Image_data_generator.csv",index = False)
X_train10k, X_test10k, y_train10k, y_test10k = train_test_split(X_tar_10k, y1_tar_cat_10k, test_size=.30,
stratify=y1_tar_cat_10k, random_state=1) # 70% Training and 30% Testing
tf.keras.backend.clear_session()
# second model
model_2a = Sequential()
model_2a.add(Conv2D(filters=32, kernel_size=5, strides=(1, 1), activation="relu", input_shape=(224, 224, 3)))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Conv2D(filters=64, kernel_size=5, strides=(1, 1), activation="relu"))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Conv2D(filters=128, kernel_size=3, strides=(1, 1), activation="relu"))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Dropout(rate=0.2))
model_2a.add(Conv2D(filters=256, kernel_size=3, strides=(1, 1), activation="relu"))
model_2a.add(MaxPooling2D(pool_size=(2, 2)))
model_2a.add(BatchNormalization())
model_2a.add(Dropout(rate=0.2))
#model_2a.add(Conv2D(filters=512, kernel_size=3, strides=(1, 1), activation="relu"))
#model_2a.add(MaxPooling2D(pool_size=(2, 2)))
#model_2a.add(BatchNormalization())
model_2a.add(Flatten())
#model_2a.add(Dense(512, activation="relu"))
# model_2a.add(Dropout(rate=0.7))
model_2a.add(Dense(256, activation="relu"))
model_2a.add(Dropout(rate=0.25))
model_2a.add(Dense(128, activation="relu"))
# model_2a.add(Dropout(rate=0.3))
model_2a.add(Dense(64, activation="relu"))
model_2a.add(Dense(32, activation="relu"))
model_2a.add(Dense(2, activation="sigmoid"))
optimizer = Adam(learning_rate=0.01)
model_2a.compile(optimizer = optimizer , loss = "binary_crossentropy", metrics=["accuracy"])
model_2a.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
conv2d (Conv2D) (None, 220, 220, 32) 2432
max_pooling2d (MaxPooling2D (None, 110, 110, 32) 0
)
batch_normalization (BatchN (None, 110, 110, 32) 128
ormalization)
conv2d_1 (Conv2D) (None, 106, 106, 64) 51264
max_pooling2d_1 (MaxPooling (None, 53, 53, 64) 0
2D)
batch_normalization_1 (Batc (None, 53, 53, 64) 256
hNormalization)
conv2d_2 (Conv2D) (None, 51, 51, 128) 73856
max_pooling2d_2 (MaxPooling (None, 25, 25, 128) 0
2D)
batch_normalization_2 (Batc (None, 25, 25, 128) 512
hNormalization)
dropout (Dropout) (None, 25, 25, 128) 0
conv2d_3 (Conv2D) (None, 23, 23, 256) 295168
max_pooling2d_3 (MaxPooling (None, 11, 11, 256) 0
2D)
batch_normalization_3 (Batc (None, 11, 11, 256) 1024
hNormalization)
dropout_1 (Dropout) (None, 11, 11, 256) 0
flatten (Flatten) (None, 30976) 0
dense (Dense) (None, 256) 7930112
dropout_2 (Dropout) (None, 256) 0
dense_1 (Dense) (None, 128) 32896
dense_2 (Dense) (None, 64) 8256
dense_3 (Dense) (None, 32) 2080
dense_4 (Dense) (None, 2) 66
=================================================================
Total params: 8,398,050
Trainable params: 8,397,090
Non-trainable params: 960
_________________________________________________________________
batch_size = 70
nb_epochs = 20
history_final = model_2a.fit(X_train10k, y_train10k,
batch_size=batch_size,
epochs=nb_epochs,
initial_epoch=0,
callbacks=[EarlyStopping(monitor='loss', patience=2, min_delta=0.001)])
Epoch 1/20 100/100 [==============================] - 649s 7s/step - loss: 0.5757 - accuracy: 0.7131
Epoch 2/20 100/100 [==============================] - 338s 3s/step - loss: 0.5713 - accuracy: 0.7149
Epoch 3/20 100/100 [==============================] - 333s 3s/step - loss: 0.5633 - accuracy: 0.7191
Epoch 4/20 100/100 [==============================] - 3329s 34s/step - loss: 0.5385 - accuracy: 0.7367
Epoch 5/20 100/100 [==============================] - 339s 3s/step - loss: 0.5289 - accuracy: 0.7437
Epoch 6/20 100/100 [==============================] - 334s 3s/step - loss: 0.5240 - accuracy: 0.7459
Epoch 7/20 100/100 [==============================] - 332s 3s/step - loss: 0.5045 - accuracy: 0.7550
Epoch 8/20 100/100 [==============================] - 333s 3s/step - loss: 0.4956 - accuracy: 0.7616
Epoch 9/20 100/100 [==============================] - 332s 3s/step - loss: 0.4948 - accuracy: 0.7636
Epoch 10/20 100/100 [==============================] - 297s 3s/step - loss: 0.5021 - accuracy: 0.7606
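The `EarlyStopping(monitor='loss', patience=2, min_delta=0.001)` callback stops training once the monitored loss has failed to improve by at least `min_delta` for `patience` consecutive epochs. A toy re-implementation of that rule (illustrative only, not the Keras source):

```python
def early_stop_epoch(losses, patience=2, min_delta=0.001):
    """Return the 1-based epoch after which training would stop, or None."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(losses, start=1):
        if loss < best - min_delta:   # improvement of at least min_delta
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:      # patience exhausted -> stop here
                return epoch
    return None  # ran to completion without triggering

# Loss plateaus after epoch 3, so training halts at epoch 5:
print(early_stop_epoch([0.60, 0.55, 0.52, 0.5195, 0.5202]))  # 5
```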
model_2a.evaluate(X_test10k,y_test10k)
94/94 [==============================] - 19s 200ms/step - loss: 0.5642 - accuracy: 0.7340
[0.5642110109329224, 0.734000027179718]
y_pred_2a = model_2a.predict(X_test10k)
y_test_arg=np.argmax(y_test10k,axis=1)
Y_pred_2a = np.argmax(y_pred_2a,axis=1)
print(confusion_matrix(y_test_arg, Y_pred_2a))
94/94 [==============================] - 20s 204ms/step
[[ 985  515]
 [ 283 1217]]
print(classification_report(y_test_arg, Y_pred_2a))
precision recall f1-score support
0 0.78 0.66 0.71 1500
1 0.70 0.81 0.75 1500
accuracy 0.73 3000
macro avg 0.74 0.73 0.73 3000
weighted avg 0.74 0.73 0.73 3000
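Both the one-hot ground truth and the per-class probability outputs are collapsed to integer labels with `np.argmax` before the confusion matrix is computed; for instance:

```python
import numpy as np

probs = np.array([[0.2, 0.8],    # predicted class 1 (Pneumonia)
                  [0.9, 0.1]])   # predicted class 0 (No Pneumonia)
one_hot_true = np.array([[0, 1],
                         [1, 0]])

print(np.argmax(probs, axis=1))         # [1 0]
print(np.argmax(one_hot_true, axis=1))  # [1 0]
```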
cm = confusion_matrix(y_test_arg, Y_pred_2a)
labels = ["0 - No Pneumonia", "1 - Pneumonia"]
disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=labels)
disp.plot()
plt.show()
**************************** END OF FINE TUNING ****************************
* Transfer Learning:
VGG-16
ResNet-50
Inception-V3
DenseNet-121 with ImageNet weights
DenseNet-121 with CheXNet weights
* Data Used:
X_train1 - Training set of 3840 images of size 224x224x3, converted from DICOM
y_train1 - Target classes for the 3840 training images
X_test - Test set of 1200 images of size 224x224x3, converted from DICOM
y_test - Target classes for the 1200 test images
X_val - Validation set of 960 images of size 224x224x3, converted from DICOM
y_val - Target classes for the 960 validation images
train_generator - Generator with augmentation over the 3840 training images
Test_generator - Generator (preprocessing only) over the 1200 test images
valid_generator - Generator (preprocessing only) over the 960 validation images
model_vgg = VGG16(input_shape=(224,224,3), weights='imagenet', include_top=False)
print(model_vgg.summary())
Model: "vgg16"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) [(None, 224, 224, 3)] 0
block1_conv1 (Conv2D) (None, 224, 224, 64) 1792
block1_conv2 (Conv2D) (None, 224, 224, 64) 36928
block1_pool (MaxPooling2D) (None, 112, 112, 64) 0
block2_conv1 (Conv2D) (None, 112, 112, 128) 73856
block2_conv2 (Conv2D) (None, 112, 112, 128) 147584
block2_pool (MaxPooling2D) (None, 56, 56, 128) 0
block3_conv1 (Conv2D) (None, 56, 56, 256) 295168
block3_conv2 (Conv2D) (None, 56, 56, 256) 590080
block3_conv3 (Conv2D) (None, 56, 56, 256) 590080
block3_pool (MaxPooling2D) (None, 28, 28, 256) 0
block4_conv1 (Conv2D) (None, 28, 28, 512) 1180160
block4_conv2 (Conv2D) (None, 28, 28, 512) 2359808
block4_conv3 (Conv2D) (None, 28, 28, 512) 2359808
block4_pool (MaxPooling2D) (None, 14, 14, 512) 0
block5_conv1 (Conv2D) (None, 14, 14, 512) 2359808
block5_conv2 (Conv2D) (None, 14, 14, 512) 2359808
block5_conv3 (Conv2D) (None, 14, 14, 512) 2359808
block5_pool (MaxPooling2D) (None, 7, 7, 512) 0
=================================================================
Total params: 14,714,688
Trainable params: 14,714,688
Non-trainable params: 0
_________________________________________________________________
None
# don't train existing weights
for layer in model_vgg.layers:
layer.trainable = False
# our layers
x = Flatten()(model_vgg.output)
# Add fully connected layers with ReLU activation (128, 84, and 42 units)
x = Dense(128, kernel_initializer='he_normal', activation='relu')(x)
x = Dense(84, kernel_initializer='he_normal', activation='relu')(x)
x = Dense(42, kernel_initializer='he_normal', activation='relu')(x)
# Add a dropout rate of 0.3
x = Dropout(0.3)(x)
# Add a final sigmoid layer with 2 nodes for the one-hot classification output
x = Dense(2, activation='sigmoid')(x)
model_vgg_1 = tf.keras.models.Model(model_vgg.input, x)
model_vgg_1.compile(optimizer = tf.keras.optimizers.RMSprop(learning_rate=0.0001), loss = 'binary_crossentropy', metrics = ['acc'])
print(x)
# x = Dense(1000, activation='relu')(x)
# prediction = Dense(2, activation='sigmoid')(x)
KerasTensor(type_spec=TensorSpec(shape=(None, 2), dtype=tf.float32, name=None), name='dense_12/Sigmoid:0', description="created by layer 'dense_12'")
#Using pixel arrays
batch_size = 60
nb_epochs = 10
history_vggi = model_vgg_1.fit(X_train1,y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val,y_val),
initial_epoch=0)
Epoch 1/10 64/64 [==============================] - 313s 5s/step - loss: 0.3477 - acc: 0.8586 - val_loss: 0.7265 - val_acc: 0.6865
Epoch 2/10 64/64 [==============================] - 339s 5s/step - loss: 0.2760 - acc: 0.8948 - val_loss: 0.8873 - val_acc: 0.7198
Epoch 3/10 64/64 [==============================] - 338s 5s/step - loss: 0.2010 - acc: 0.9253 - val_loss: 0.9280 - val_acc: 0.7208
Epoch 4/10 64/64 [==============================] - 338s 5s/step - loss: 0.1821 - acc: 0.9365 - val_loss: 0.9055 - val_acc: 0.7146
Epoch 5/10 64/64 [==============================] - 338s 5s/step - loss: 0.1453 - acc: 0.9471 - val_loss: 0.9951 - val_acc: 0.7156
Epoch 6/10 64/64 [==============================] - 338s 5s/step - loss: 0.1064 - acc: 0.9628 - val_loss: 1.1589 - val_acc: 0.7167
Epoch 7/10 64/64 [==============================] - 337s 5s/step - loss: 0.0915 - acc: 0.9703 - val_loss: 1.1338 - val_acc: 0.7198
Epoch 8/10 64/64 [==============================] - 340s 5s/step - loss: 0.0821 - acc: 0.9753 - val_loss: 1.4085 - val_acc: 0.7260
Epoch 9/10 64/64 [==============================] - 338s 5s/step - loss: 0.0756 - acc: 0.9745 - val_loss: 1.3455 - val_acc: 0.7115
Epoch 10/10 64/64 [==============================] - 338s 5s/step - loss: 0.0491 - acc: 0.9883 - val_loss: 1.6710 - val_acc: 0.7365
fig1 = plt.gcf()
plt.plot(history_vggi.history['acc'])
plt.plot(history_vggi.history['val_acc'])
plt.axis(ymin=0.4,ymax=1)
plt.grid()
plt.title('Model Accuracy')
plt.ylabel('Accuracy')
plt.xlabel('Epochs')
plt.legend(['train', 'validation'])
plt.show()
fig1 = plt.gcf()
plt.plot(history_vggi.history['loss'])
plt.plot(history_vggi.history['val_loss'])
plt.axis(ymin=0.4,ymax=1)
plt.grid()
plt.title('Model Loss')
plt.ylabel('Loss')
plt.xlabel('Epochs')
plt.legend(['train', 'validation'])
plt.show()
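The accuracy and loss plotting cells above are identical except for the history keys they draw; a small helper avoids the duplication (a sketch: the 'acc'/'val_acc' key names follow this notebook's `metrics=['acc']` setting, and the Agg backend is selected only so the sketch runs headless):

```python
import matplotlib
matplotlib.use("Agg")  # non-interactive backend; drop this line in a notebook
import matplotlib.pyplot as plt

def plot_history(history_dict, train_key, val_key, title, ylabel):
    """Plot a train-vs-validation curve from a Keras History.history dict."""
    fig, ax = plt.subplots()
    ax.plot(history_dict[train_key], label="train")
    ax.plot(history_dict[val_key], label="validation")
    ax.set(title=title, xlabel="Epochs", ylabel=ylabel)
    ax.grid(True)
    ax.legend()
    return fig

# Usage with the VGG run:
# plot_history(history_vggi.history, "acc", "val_acc", "Model Accuracy", "Accuracy")
# plot_history(history_vggi.history, "loss", "val_loss", "Model Loss", "Loss")
```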
loss_trainvgg, accuracy_trainvgg = model_vgg_1.evaluate(X_train1,y_train1)
loss_valvgg, accuracy_valvgg = model_vgg_1.evaluate(X_val,y_val)
loss_testvgg, accuracy_testvgg = model_vgg_1.evaluate(X_test,y_test)
120/120 [==============================] - 214s 2s/step - loss: 0.0086 - acc: 0.9990
30/30 [==============================] - 68s 2s/step - loss: 1.6710 - acc: 0.7365
38/38 [==============================] - 70s 2s/step - loss: 1.6251 - acc: 0.7125
y_pred_vgg = model_vgg_1.predict(X_test)
38/38 [==============================] - 69s 2s/step
y_test_arg_vgg = np.argmax(y_test,axis=1)
y_pred_vgg = np.argmax(y_pred_vgg,axis=1)
print(confusion_matrix(y_test_arg_vgg, y_pred_vgg))
[[416 184]
 [161 439]]
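As a sanity check, the headline metrics can be recomputed by hand from this confusion matrix (rows are true labels, columns are predictions, so TN=416, FP=184, FN=161, TP=439). A minimal sketch, independent of the notebook's variables:

```python
# Counts taken from the confusion matrix printed above
tn, fp, fn, tp = 416, 184, 161, 439

accuracy = (tp + tn) / (tp + tn + fp + fn)        # 855 / 1200
precision = tp / (tp + fp)                        # 439 / 623
recall = tp / (tp + fn)                           # 439 / 600
f1 = 2 * precision * recall / (precision + recall)

print(accuracy)  # → 0.7125, matching the test-set evaluate() result above
```

The 0.7125 agrees with the test accuracy reported by `model_vgg_1.evaluate(X_test, y_test)`, which is a quick way to confirm the argmax decoding was done consistently.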
Accuracy = metrics.accuracy_score(y_test_arg_vgg, y_pred_vgg)
Precision = metrics.precision_score(y_test_arg_vgg, y_pred_vgg)
Recall = metrics.recall_score(y_test_arg_vgg, y_pred_vgg)
F1_score = metrics.f1_score(y_test_arg_vgg, y_pred_vgg)
ROC = metrics.roc_auc_score(y_test_arg_vgg, y_pred_vgg)
Pre_trained_results = pd.DataFrame()
vgg_series = pd.Series(["VGG 16", Accuracy, Precision, Recall, F1_score, ROC])
# DataFrame.append was removed in pandas 2.0; pd.concat is the replacement
Pre_trained_results = pd.concat([Pre_trained_results, vgg_series.to_frame().T], ignore_index=True)
# Using ImageDataGenerator for train and validation
STEP_SIZE_TRAIN=round_up(train_generator.n/train_generator.batch_size)
STEP_SIZE_VALID=round_up(valid_generator.n/valid_generator.batch_size)
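`round_up` is defined earlier in the notebook; the intent is a ceiling, so the last partial batch still counts as a full step. A minimal equivalent, assuming that behavior:

```python
import math

def round_up(x):
    # Ceiling of a float step count, e.g. 1199 samples / 32 per batch -> 38 steps
    return math.ceil(x)
```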
# fit_generator is deprecated; Model.fit accepts generators directly
history_vggi = model_vgg_1.fit(
    train_generator,
    steps_per_epoch=STEP_SIZE_TRAIN,
    validation_data=valid_generator,
    validation_steps=STEP_SIZE_VALID,
    epochs=15)
Epoch 1/15
120/120 [==============================] - 311s 3s/step - loss: 1.4532 - acc: 0.6281 - val_loss: 0.6072 - val_acc: 0.7208
Epoch 2/15
120/120 [==============================] - 351s 3s/step - loss: 0.6461 - acc: 0.6867 - val_loss: 0.5692 - val_acc: 0.7469
Epoch 3/15
120/120 [==============================] - 350s 3s/step - loss: 0.6244 - acc: 0.6964 - val_loss: 0.5626 - val_acc: 0.7542
Epoch 4/15
120/120 [==============================] - 332s 3s/step - loss: 0.6046 - acc: 0.7146 - val_loss: 0.5542 - val_acc: 0.7469
Epoch 5/15
120/120 [==============================] - 340s 3s/step - loss: 0.5996 - acc: 0.7169 - val_loss: 0.5891 - val_acc: 0.7437
Epoch 6/15
120/120 [==============================] - 354s 3s/step - loss: 0.6023 - acc: 0.7359 - val_loss: 0.5317 - val_acc: 0.7615
Epoch 7/15
120/120 [==============================] - 294s 2s/step - loss: 0.5859 - acc: 0.7318 - val_loss: 0.5500 - val_acc: 0.7646
Epoch 8/15
120/120 [==============================] - 278s 2s/step - loss: 0.6011 - acc: 0.7302 - val_loss: 0.5882 - val_acc: 0.7708
Epoch 9/15
120/120 [==============================] - 278s 2s/step - loss: 0.5755 - acc: 0.7328 - val_loss: 0.5498 - val_acc: 0.7708
Epoch 10/15
120/120 [==============================] - 279s 2s/step - loss: 0.5627 - acc: 0.7258 - val_loss: 0.5462 - val_acc: 0.7677
Epoch 11/15
120/120 [==============================] - 278s 2s/step - loss: 0.5538 - acc: 0.7357 - val_loss: 0.6101 - val_acc: 0.7677
Epoch 12/15
120/120 [==============================] - 278s 2s/step - loss: 0.5675 - acc: 0.7437 - val_loss: 0.5402 - val_acc: 0.7615
Epoch 13/15
120/120 [==============================] - 277s 2s/step - loss: 0.5620 - acc: 0.7474 - val_loss: 0.5373 - val_acc: 0.7531
Epoch 14/15
120/120 [==============================] - 279s 2s/step - loss: 0.5668 - acc: 0.7411 - val_loss: 0.5501 - val_acc: 0.7677
Epoch 15/15
120/120 [==============================] - 277s 2s/step - loss: 0.5623 - acc: 0.7310 - val_loss: 0.5121 - val_acc: 0.7667
fig1 = plt.gcf()
plt.plot(history_vggi.history['acc'])
plt.plot(history_vggi.history['val_acc'])
plt.axis(ymin=0.4,ymax=1)
plt.grid()
plt.title('Model Accuracy')
plt.ylabel('Accuracy')
plt.xlabel('Epochs')
plt.legend(['train', 'validation'])
plt.show()
fig1 = plt.gcf()
plt.plot(history_vggi.history['loss'])
plt.plot(history_vggi.history['val_loss'])
plt.grid()
plt.title('Model Loss')
plt.ylabel('Loss')
plt.xlabel('Epochs')
plt.legend(['train', 'validation'])
plt.show()
loss_trainvgg, accuracy_trainvgg = model_vgg_1.evaluate(train_generator)
loss_valvgg, accuracy_valvgg = model_vgg_1.evaluate(valid_generator)
loss_testvgg, accuracy_testvgg = model_vgg_1.evaluate(Test_generator)
120/120 [==============================] - 245s 2s/step - loss: 0.5128 - acc: 0.7646
30/30 [==============================] - 69s 2s/step - loss: 0.5121 - acc: 0.7667
38/38 [==============================] - 86s 2s/step - loss: 0.5654 - acc: 0.7425
y_pred_vgg = model_vgg_1.predict(X_test)
38/38 [==============================] - 70s 2s/step
y_test_arg_vgg = np.argmax(y_test,axis=1)
y_pred_vgg = np.argmax(y_pred_vgg,axis=1)
print(confusion_matrix(y_test_arg_vgg, y_pred_vgg))
[[495 105]
 [299 301]]
cm = confusion_matrix(y_test_arg_vgg, y_pred_vgg)
labels = ["0 - No Pneumonia", "1 - Pneumonia"]
disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=labels)
disp.plot()
plt.show()
Accuracy = metrics.accuracy_score(y_test_arg_vgg, y_pred_vgg)
Precision = metrics.precision_score(y_test_arg_vgg, y_pred_vgg)
Recall = metrics.recall_score(y_test_arg_vgg, y_pred_vgg)
F1_score = metrics.f1_score(y_test_arg_vgg, y_pred_vgg)
ROC = metrics.roc_auc_score(y_test_arg_vgg, y_pred_vgg)
vgg_series = pd.Series(["VGG 16 Using IDG", Accuracy, Precision, Recall, F1_score, ROC])
Pre_trained_results = pd.concat([Pre_trained_results, vgg_series.to_frame().T], ignore_index=True)
print("VGG16 - Accuracy: ",Accuracy)
print("VGG16 - Precision: ",Precision)
print("VGG16 - Recall: ",Recall)
print("VGG16 - F1 score: ",F1_score)
print("VGG16 - ROC: ",ROC)
VGG16 - Accuracy: 0.6633333333333333
VGG16 - Precision: 0.7413793103448276
VGG16 - Recall: 0.5016666666666667
VGG16 - F1 score: 0.5984095427435389
VGG16 - ROC: 0.6633333333333333
resnet_model_1 = Sequential()
pretrained_model= tf.keras.applications.ResNet50(include_top=False,
input_shape=(224,224,3), pooling='avg',classes=2, weights='imagenet')
for layer in pretrained_model.layers:
layer.trainable=False
resnet_model_1.add(pretrained_model)
resnet_model_1.add(Flatten())
resnet_model_1.add(Dense(128, kernel_initializer='he_normal', activation='relu'))
resnet_model_1.add(Dense(64, kernel_initializer='he_normal', activation='relu'))
resnet_model_1.add(Dense(32, kernel_initializer='he_normal', activation='relu'))
resnet_model_1.add(Dense(2, kernel_initializer='he_normal', activation='sigmoid'))
resnet_model_1.summary()
Model: "sequential_1"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
resnet50 (Functional) (None, 2048) 23587712
flatten_3 (Flatten) (None, 2048) 0
dense_13 (Dense) (None, 128) 262272
dense_14 (Dense) (None, 64) 8256
dense_15 (Dense) (None, 32) 2080
dense_16 (Dense) (None, 2) 66
=================================================================
Total params: 23,860,386
Trainable params: 272,674
Non-trainable params: 23,587,712
_________________________________________________________________
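The trainable-parameter count in this summary can be verified by hand: with the ResNet50 base frozen, only the Dense head trains, and each Dense layer contributes in_features × units weights plus units biases. A quick arithmetic check:

```python
# Dense head sitting on the 2048-dim pooled ResNet50 features: 128 -> 64 -> 32 -> 2
sizes = [2048, 128, 64, 32, 2]
trainable = sum(n_in * n_out + n_out for n_in, n_out in zip(sizes, sizes[1:]))
print(trainable)  # → 272674, matching "Trainable params" above
```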
resnet_model_1.compile(optimizer=Adam(learning_rate=0.00001),loss='binary_crossentropy',metrics=['acc'])
batch_size = 80
nb_epochs = 10
resnet_model_1_history = resnet_model_1.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
48/48 [==============================] - 187s 4s/step - loss: 0.7084 - acc: 0.5417 - val_loss: 0.6611 - val_acc: 0.6177
Epoch 2/10
48/48 [==============================] - 184s 4s/step - loss: 0.6440 - acc: 0.6461 - val_loss: 0.6257 - val_acc: 0.6906
Epoch 3/10
48/48 [==============================] - 193s 4s/step - loss: 0.6156 - acc: 0.6878 - val_loss: 0.6010 - val_acc: 0.6990
Epoch 4/10
48/48 [==============================] - 187s 4s/step - loss: 0.5956 - acc: 0.7068 - val_loss: 0.5845 - val_acc: 0.7000
Epoch 5/10
48/48 [==============================] - 187s 4s/step - loss: 0.5827 - acc: 0.7122 - val_loss: 0.5748 - val_acc: 0.7031
Epoch 6/10
48/48 [==============================] - 181s 4s/step - loss: 0.5742 - acc: 0.7211 - val_loss: 0.5684 - val_acc: 0.7052
Epoch 7/10
48/48 [==============================] - 182s 4s/step - loss: 0.5675 - acc: 0.7190 - val_loss: 0.5627 - val_acc: 0.7104
Epoch 8/10
48/48 [==============================] - 181s 4s/step - loss: 0.5627 - acc: 0.7214 - val_loss: 0.5584 - val_acc: 0.7125
Epoch 9/10
48/48 [==============================] - 181s 4s/step - loss: 0.5575 - acc: 0.7224 - val_loss: 0.5551 - val_acc: 0.7146
Epoch 10/10
48/48 [==============================] - 181s 4s/step - loss: 0.5537 - acc: 0.7260 - val_loss: 0.5520 - val_acc: 0.7188
fig1 = plt.gcf()
plt.plot(resnet_model_1_history.history['acc'])
plt.plot(resnet_model_1_history.history['val_acc'])
plt.axis(ymin=0.4,ymax=1)
plt.grid()
plt.title('Model Accuracy')
plt.ylabel('Accuracy')
plt.xlabel('Epochs')
plt.legend(['train', 'validation'])
plt.show()
fig1 = plt.gcf()
plt.plot(resnet_model_1_history.history['loss'])
plt.plot(resnet_model_1_history.history['val_loss'])
plt.grid()
plt.title('Model Loss')
plt.ylabel('Loss')
plt.xlabel('Epochs')
plt.legend(['train', 'validation'])
plt.show()
loss_trainrn1, accuracy_trainrn1 = resnet_model_1.evaluate(X_train1,y_train1)
loss_valrn1, accuracy_valrn1 = resnet_model_1.evaluate(X_val,y_val)
loss_testrn1, accuracy_testrn1 = resnet_model_1.evaluate(X_test,y_test)
120/120 [==============================] - 147s 1s/step - loss: 0.5508 - acc: 0.7279
30/30 [==============================] - 37s 1s/step - loss: 0.5520 - acc: 0.7188
38/38 [==============================] - 46s 1s/step - loss: 0.5648 - acc: 0.7183
y_pred_rn1 = resnet_model_1.predict(X_test)
38/38 [==============================] - 47s 1s/step
y_test_arg_rn1 = np.argmax(y_test,axis=1)
y_pred_rn1 = np.argmax(y_pred_rn1,axis=1)
print(confusion_matrix(y_test_arg_rn1, y_pred_rn1))
[[416 184]
 [154 446]]
Accuracy = metrics.accuracy_score(y_test_arg_rn1, y_pred_rn1)
Precision = metrics.precision_score(y_test_arg_rn1, y_pred_rn1)
Recall = metrics.recall_score(y_test_arg_rn1, y_pred_rn1)
F1_score = metrics.f1_score(y_test_arg_rn1, y_pred_rn1)
ROC = metrics.roc_auc_score(y_test_arg_rn1, y_pred_rn1)
rn1_series = pd.Series(["RESNET50", Accuracy, Precision, Recall, F1_score, ROC])
Pre_trained_results = pd.concat([Pre_trained_results, rn1_series.to_frame().T], ignore_index=True)
print("RESNET 50 - Accuracy: ", Accuracy)
print("RESNET 50 - Precision: ", Precision)
print("RESNET 50 - Recall: ", Recall)
print("RESNET 50 - F1 score: ", F1_score)
print("RESNET 50 - ROC: ", ROC)
RESNET 50 - Accuracy: 0.7183333333333334
RESNET 50 - Precision: 0.707936507936508
RESNET 50 - Recall: 0.7433333333333333
RESNET 50 - F1 score: 0.7252032520325203
RESNET 50 - ROC: 0.7183333333333333
# fit_generator is deprecated; Model.fit accepts generators directly
history_resneti = resnet_model_1.fit(
    train_generator,
    steps_per_epoch=STEP_SIZE_TRAIN,
    validation_data=valid_generator,
    validation_steps=STEP_SIZE_VALID,
    epochs=15)
Epoch 1/15
120/120 [==============================] - 241s 2s/step - loss: 0.6205 - acc: 0.6763 - val_loss: 0.5518 - val_acc: 0.7323
Epoch 2/15
120/120 [==============================] - 235s 2s/step - loss: 0.5708 - acc: 0.7273 - val_loss: 0.5348 - val_acc: 0.7406
Epoch 3/15
120/120 [==============================] - 232s 2s/step - loss: 0.5548 - acc: 0.7318 - val_loss: 0.5282 - val_acc: 0.7469
Epoch 4/15
120/120 [==============================] - 233s 2s/step - loss: 0.5492 - acc: 0.7349 - val_loss: 0.5253 - val_acc: 0.7500
Epoch 5/15
120/120 [==============================] - 232s 2s/step - loss: 0.5458 - acc: 0.7398 - val_loss: 0.5181 - val_acc: 0.7531
Epoch 6/15
120/120 [==============================] - 231s 2s/step - loss: 0.5341 - acc: 0.7451 - val_loss: 0.5192 - val_acc: 0.7500
Epoch 7/15
120/120 [==============================] - 232s 2s/step - loss: 0.5311 - acc: 0.7497 - val_loss: 0.5096 - val_acc: 0.7552
Epoch 8/15
120/120 [==============================] - 233s 2s/step - loss: 0.5341 - acc: 0.7422 - val_loss: 0.5099 - val_acc: 0.7521
Epoch 9/15
120/120 [==============================] - 233s 2s/step - loss: 0.5252 - acc: 0.7555 - val_loss: 0.5031 - val_acc: 0.7552
Epoch 10/15
120/120 [==============================] - 231s 2s/step - loss: 0.5281 - acc: 0.7445 - val_loss: 0.5113 - val_acc: 0.7542
Epoch 11/15
120/120 [==============================] - 231s 2s/step - loss: 0.5270 - acc: 0.7589 - val_loss: 0.5116 - val_acc: 0.7542
Epoch 12/15
120/120 [==============================] - 232s 2s/step - loss: 0.5171 - acc: 0.7557 - val_loss: 0.5100 - val_acc: 0.7583
Epoch 13/15
120/120 [==============================] - 232s 2s/step - loss: 0.5279 - acc: 0.7430 - val_loss: 0.5021 - val_acc: 0.7521
Epoch 14/15
120/120 [==============================] - 231s 2s/step - loss: 0.5188 - acc: 0.7482 - val_loss: 0.5129 - val_acc: 0.7594
Epoch 15/15
120/120 [==============================] - 231s 2s/step - loss: 0.5194 - acc: 0.7581 - val_loss: 0.5008 - val_acc: 0.7594
fig1 = plt.gcf()
plt.plot(history_resneti.history['acc'])
plt.plot(history_resneti.history['val_acc'])
plt.axis(ymin=0.4,ymax=1)
plt.grid()
plt.title('Model Accuracy')
plt.ylabel('Accuracy')
plt.xlabel('Epochs')
plt.legend(['train', 'validation'])
plt.show()
fig1 = plt.gcf()
plt.plot(history_resneti.history['loss'])
plt.plot(history_resneti.history['val_loss'])
plt.grid()
plt.title('Model Loss')
plt.ylabel('Loss')
plt.xlabel('Epochs')
plt.legend(['train', 'validation'])
plt.show()
loss_trainrn2, accuracy_trainrn2 = resnet_model_1.evaluate(train_generator)
loss_valrn2, accuracy_valrn2 = resnet_model_1.evaluate(valid_generator)
loss_testrn2, accuracy_testrn2 = resnet_model_1.evaluate(Test_generator)
120/120 [==============================] - 176s 1s/step - loss: 0.5122 - acc: 0.7563
30/30 [==============================] - 43s 1s/step - loss: 0.5008 - acc: 0.7594
38/38 [==============================] - 53s 1s/step - loss: 0.5320 - acc: 0.7367
y_pred_rn2 = resnet_model_1.predict(X_test)
38/38 [==============================] - 51s 1s/step
y_test_arg_rn2 = np.argmax(y_test,axis=1)
y_pred_rn2 = np.argmax(y_pred_rn2,axis=1)
print(confusion_matrix(y_test_arg_rn2, y_pred_rn2))
[[441 159]
 [183 417]]
Accuracy = metrics.accuracy_score(y_test_arg_rn2, y_pred_rn2)
Precision = metrics.precision_score(y_test_arg_rn2, y_pred_rn2)
Recall = metrics.recall_score(y_test_arg_rn2, y_pred_rn2)
F1_score = metrics.f1_score(y_test_arg_rn2, y_pred_rn2)
ROC = metrics.roc_auc_score(y_test_arg_rn2, y_pred_rn2)
rn2_series = pd.Series(["RESNET50_IMAGE DATA GENERATOR", Accuracy, Precision, Recall, F1_score, ROC])
Pre_trained_results = pd.concat([Pre_trained_results, rn2_series.to_frame().T], ignore_index=True)
print("RESNET 50 - Accuracy: ", Accuracy)
print("RESNET 50 - Precision: ", Precision)
print("RESNET 50 - Recall: ", Recall)
print("RESNET 50 - F1 score: ", F1_score)
print("RESNET 50 - ROC: ", ROC)
RESNET 50 - Accuracy: 0.715
RESNET 50 - Precision: 0.7239583333333334
RESNET 50 - Recall: 0.695
RESNET 50 - F1 score: 0.7091836734693878
RESNET 50 - ROC: 0.715
incep_base_model = tf.keras.applications.InceptionV3(weights='imagenet',
include_top=False,
input_shape=(224, 224,3))
incep_base_model.trainable = False
add_model = Sequential()
add_model.add(incep_base_model)
add_model.add(GlobalAveragePooling2D())
add_model.add(Dropout(0.5))
add_model.add(Dense(2,
activation='sigmoid'))
incep_model_1 = add_model
incep_model_1.compile(loss='binary_crossentropy',
                      optimizer=Adam(learning_rate=1e-4),  # lr is deprecated
                      metrics=['accuracy'])
incep_model_1.summary()
Downloading data from https://storage.googleapis.com/tensorflow/keras-applications/inception_v3/inception_v3_weights_tf_dim_ordering_tf_kernels_notop.h5
87910968/87910968 [==============================] - 23s 0us/step
Model: "sequential_2"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
inception_v3 (Functional) (None, 5, 5, 2048) 21802784
global_average_pooling2d (GlobalAveragePooling2D) (None, 2048) 0
dropout_5 (Dropout) (None, 2048) 0
dense_17 (Dense) (None, 2) 4098
=================================================================
Total params: 21,806,882
Trainable params: 4,098
Non-trainable params: 21,802,784
_________________________________________________________________
batch_size = 80
nb_epochs = 10
incep_hist_1 = incep_model_1.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
48/48 [==============================] - 95s 2s/step - loss: 6.3972 - accuracy: 0.5049 - val_loss: 1.6682 - val_accuracy: 0.5740
Epoch 2/10
48/48 [==============================] - 91s 2s/step - loss: 5.5562 - accuracy: 0.5440 - val_loss: 1.6705 - val_accuracy: 0.6125
Epoch 3/10
48/48 [==============================] - 92s 2s/step - loss: 5.2879 - accuracy: 0.5643 - val_loss: 1.7044 - val_accuracy: 0.6385
fig1 = plt.gcf()
plt.plot(incep_hist_1.history['accuracy'])
plt.plot(incep_hist_1.history['val_accuracy'])
plt.axis(ymin=0.4,ymax=1)
plt.grid()
plt.title('Model Accuracy')
plt.ylabel('Accuracy')
plt.xlabel('Epochs')
plt.legend(['train', 'validation'])
plt.show()
loss_traininc1, accuracy_traininc1 = incep_model_1.evaluate(X_train1,y_train1)
loss_valinc1, accuracy_valinc1 = incep_model_1.evaluate(X_val,y_val)
loss_testinc1, accuracy_testinc1 = incep_model_1.evaluate(X_test,y_test)
120/120 [==============================] - 83s 683ms/step - loss: 1.6515 - accuracy: 0.6414
30/30 [==============================] - 22s 718ms/step - loss: 1.7044 - accuracy: 0.6385
38/38 [==============================] - 27s 702ms/step - loss: 1.7307 - accuracy: 0.6367
y_pred_inc1 = incep_model_1.predict(X_test)
38/38 [==============================] - 26s 664ms/step
y_test_arg_inc1 = np.argmax(y_test,axis=1)
y_pred_inc1 = np.argmax(y_pred_inc1,axis=1)
print(confusion_matrix(y_test_arg_inc1, y_pred_inc1))
[[414 186]
 [250 350]]
Accuracy = metrics.accuracy_score(y_test_arg_inc1, y_pred_inc1)
Precision = metrics.precision_score(y_test_arg_inc1, y_pred_inc1)
Recall = metrics.recall_score(y_test_arg_inc1, y_pred_inc1)
F1_score = metrics.f1_score(y_test_arg_inc1, y_pred_inc1)
ROC = metrics.roc_auc_score(y_test_arg_inc1, y_pred_inc1)
inc1_series = pd.Series(["INCEPTION V3", Accuracy, Precision, Recall, F1_score, ROC])
Pre_trained_results = pd.concat([Pre_trained_results, inc1_series.to_frame().T], ignore_index=True)
print("INCEPTION V3 - Accuracy: ",Accuracy)
print("INCEPTION V3 - Precision: ",Precision)
print("INCEPTION V3 - Recall: ",Recall)
print("INCEPTION V3 - F1 score: ",F1_score)
print("INCEPTION V3 - ROC: ",ROC)
INCEPTION V3 - Accuracy: 0.6366666666666667
INCEPTION V3 - Precision: 0.6529850746268657
INCEPTION V3 - Recall: 0.5833333333333334
INCEPTION V3 - F1 score: 0.6161971830985915
INCEPTION V3 - ROC: 0.6366666666666667
# fit_generator is deprecated; Model.fit accepts generators directly
history_inceptioni = incep_model_1.fit(
    train_generator,
    steps_per_epoch=STEP_SIZE_TRAIN,
    validation_data=valid_generator,
    validation_steps=STEP_SIZE_VALID,
    epochs=15)
Epoch 1/15
120/120 [==============================] - 144s 1s/step - loss: 8.4178 - accuracy: 0.5672 - val_loss: 2.4637 - val_accuracy: 0.6781
Epoch 2/15
120/120 [==============================] - 143s 1s/step - loss: 7.4747 - accuracy: 0.5745 - val_loss: 2.3133 - val_accuracy: 0.6865
Epoch 3/15
120/120 [==============================] - 142s 1s/step - loss: 6.7021 - accuracy: 0.5966 - val_loss: 2.2380 - val_accuracy: 0.6958
Epoch 4/15
120/120 [==============================] - 142s 1s/step - loss: 6.2577 - accuracy: 0.5969 - val_loss: 2.1274 - val_accuracy: 0.6958
Epoch 5/15
120/120 [==============================] - 142s 1s/step - loss: 5.9501 - accuracy: 0.5852 - val_loss: 1.9410 - val_accuracy: 0.7094
Epoch 6/15
120/120 [==============================] - 141s 1s/step - loss: 5.4407 - accuracy: 0.6000 - val_loss: 1.7987 - val_accuracy: 0.7104
Epoch 7/15
120/120 [==============================] - 142s 1s/step - loss: 4.9343 - accuracy: 0.5945 - val_loss: 1.7518 - val_accuracy: 0.7094
Epoch 8/15
120/120 [==============================] - 141s 1s/step - loss: 4.3727 - accuracy: 0.6094 - val_loss: 1.5628 - val_accuracy: 0.7115
Epoch 9/15
120/120 [==============================] - 142s 1s/step - loss: 4.1116 - accuracy: 0.6164 - val_loss: 1.4833 - val_accuracy: 0.7094
Epoch 10/15
120/120 [==============================] - 142s 1s/step - loss: 4.1056 - accuracy: 0.5940 - val_loss: 1.4073 - val_accuracy: 0.7135
Epoch 11/15
120/120 [==============================] - 142s 1s/step - loss: 3.5564 - accuracy: 0.6130 - val_loss: 1.3399 - val_accuracy: 0.7156
Epoch 12/15
120/120 [==============================] - 141s 1s/step - loss: 3.5717 - accuracy: 0.5932 - val_loss: 1.2253 - val_accuracy: 0.7156
Epoch 13/15
120/120 [==============================] - 142s 1s/step - loss: 3.2935 - accuracy: 0.6031 - val_loss: 1.1730 - val_accuracy: 0.7229
Epoch 14/15
120/120 [==============================] - 141s 1s/step - loss: 3.1212 - accuracy: 0.6018 - val_loss: 1.0627 - val_accuracy: 0.7177
Epoch 15/15
120/120 [==============================] - 142s 1s/step - loss: 2.7746 - accuracy: 0.6177 - val_loss: 1.0201 - val_accuracy: 0.7146
fig1 = plt.gcf()
plt.plot(history_inceptioni.history['accuracy'])
plt.plot(history_inceptioni.history['val_accuracy'])
plt.axis(ymin=0.4,ymax=1)
plt.grid()
plt.title('Model Accuracy')
plt.ylabel('Accuracy')
plt.xlabel('Epochs')
plt.legend(['train', 'validation'])
plt.show()
loss_traininc2, accuracy_traininc2 = incep_model_1.evaluate(train_generator)
loss_valinc2, accuracy_valinc2 = incep_model_1.evaluate(valid_generator)
loss_testinc2, accuracy_testinc2 = incep_model_1.evaluate(Test_generator)
120/120 [==============================] - 118s 981ms/step - loss: 1.2315 - accuracy: 0.6802
30/30 [==============================] - 23s 755ms/step - loss: 1.0201 - accuracy: 0.7146
38/38 [==============================] - 28s 745ms/step - loss: 1.1289 - accuracy: 0.7083
y_pred_inc2 = incep_model_1.predict(X_test)
38/38 [==============================] - 25s 670ms/step
y_test_arg_inc2 = np.argmax(y_test,axis=1)
y_pred_inc2 = np.argmax(y_pred_inc2,axis=1)
print(confusion_matrix(y_test_arg_inc2, y_pred_inc2))
[[175 425]
 [ 72 528]]
Accuracy = metrics.accuracy_score(y_test_arg_inc2, y_pred_inc2)
Precision = metrics.precision_score(y_test_arg_inc2, y_pred_inc2)
Recall = metrics.recall_score(y_test_arg_inc2, y_pred_inc2)
F1_score = metrics.f1_score(y_test_arg_inc2, y_pred_inc2)
ROC = metrics.roc_auc_score(y_test_arg_inc2, y_pred_inc2)
inc2_series = pd.Series(["INCEPTION V3_IMAGE DATA GENERATOR", Accuracy, Precision, Recall, F1_score, ROC])
Pre_trained_results = pd.concat([Pre_trained_results, inc2_series.to_frame().T], ignore_index=True)
print("INCEPTION V3 - Accuracy: ",Accuracy)
print("INCEPTION V3 - Precision: ",Precision)
print("INCEPTION V3 - Recall: ",Recall)
print("INCEPTION V3 - F1 score: ",F1_score)
print("INCEPTION V3 - ROC: ",ROC)
INCEPTION V3 - Accuracy: 0.5858333333333333
INCEPTION V3 - Precision: 0.5540398740818469
INCEPTION V3 - Recall: 0.88
INCEPTION V3 - F1 score: 0.6799742433998713
INCEPTION V3 - ROC: 0.5858333333333333
# Increase Trainable parameters in the model and retrain
add_model = Sequential()
add_model.add(incep_base_model)
add_model.add(GlobalAveragePooling2D())
# add_model.add(Dropout(0.3))
add_model.add(Dense(128, activation="relu"))
add_model.add(Dense(64, activation="relu"))
add_model.add(Dense(2,
activation='sigmoid'))
incep_model_2 = add_model
incep_model_2.compile(loss='binary_crossentropy',
                      optimizer=RMSprop(learning_rate=0.0001),  # lr is deprecated
                      metrics=['accuracy'])
incep_model_2.summary()
Model: "sequential_3"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
inception_v3 (Functional) (None, 5, 5, 2048) 21802784
global_average_pooling2d_1 (GlobalAveragePooling2D) (None, 2048) 0
dense_18 (Dense) (None, 128) 262272
dense_19 (Dense) (None, 64) 8256
dense_20 (Dense) (None, 2) 130
=================================================================
Total params: 22,073,442
Trainable params: 270,658
Non-trainable params: 21,802,784
_________________________________________________________________
batch_size = 80
nb_epochs = 10
incep_hist_2 = incep_model_2.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/10
48/48 [==============================] - 95s 2s/step - loss: 1.1268 - accuracy: 0.6010 - val_loss: 0.9605 - val_accuracy: 0.6594
Epoch 2/10
48/48 [==============================] - 90s 2s/step - loss: 0.8434 - accuracy: 0.6430 - val_loss: 0.6872 - val_accuracy: 0.6719
Epoch 3/10
48/48 [==============================] - 88s 2s/step - loss: 0.7823 - accuracy: 0.6617 - val_loss: 0.7902 - val_accuracy: 0.6354
Epoch 4/10
48/48 [==============================] - 88s 2s/step - loss: 0.7374 - accuracy: 0.6682 - val_loss: 0.6929 - val_accuracy: 0.6875
fig1 = plt.gcf()
plt.plot(incep_hist_2.history['accuracy'])
plt.plot(incep_hist_2.history['val_accuracy'])
plt.axis(ymin=0.4,ymax=1)
plt.grid()
plt.title('Model Accuracy')
plt.ylabel('Accuracy')
plt.xlabel('Epochs')
plt.legend(['train', 'validation'])
plt.show()
loss_traininc3, accuracy_traininc3 = incep_model_2.evaluate(train_generator)
loss_valinc3, accuracy_valinc3 = incep_model_2.evaluate(valid_generator)
loss_testinc3, accuracy_testinc3 = incep_model_2.evaluate(Test_generator)
120/120 [==============================] - 119s 980ms/step - loss: 1.8895 - accuracy: 0.5750
30/30 [==============================] - 22s 724ms/step - loss: 1.6969 - accuracy: 0.5948
38/38 [==============================] - 27s 718ms/step - loss: 1.7093 - accuracy: 0.5675
y_pred_inc3 = incep_model_2.predict(X_test)
38/38 [==============================] - 27s 695ms/step
y_test_arg_inc3 = np.argmax(y_test,axis=1)
y_pred_inc3 = np.argmax(y_pred_inc3,axis=1)
print(confusion_matrix(y_test_arg_inc3, y_pred_inc3))
[[412 188]
 [186 414]]
Accuracy = metrics.accuracy_score(y_test_arg_inc3, y_pred_inc3)
Precision = metrics.precision_score(y_test_arg_inc3, y_pred_inc3)
Recall = metrics.recall_score(y_test_arg_inc3, y_pred_inc3)
F1_score = metrics.f1_score(y_test_arg_inc3, y_pred_inc3)
ROC = metrics.roc_auc_score(y_test_arg_inc3, y_pred_inc3)
inc3_series = pd.Series(["INCEPTION V3 - ADDED LAYERS", Accuracy, Precision, Recall, F1_score, ROC])
Pre_trained_results = pd.concat([Pre_trained_results, inc3_series.to_frame().T], ignore_index=True)
print("INCEPTION V3 - Accuracy: ",Accuracy)
print("INCEPTION V3 - Precision: ",Precision)
print("INCEPTION V3 - Recall: ",Recall)
print("INCEPTION V3 - F1 score: ",F1_score)
print("INCEPTION V3 - ROC: ",ROC)
INCEPTION V3 - Accuracy: 0.6883333333333334
INCEPTION V3 - Precision: 0.6877076411960132
INCEPTION V3 - Recall: 0.69
INCEPTION V3 - F1 score: 0.6888519134775374
INCEPTION V3 - ROC: 0.6883333333333332
cm = confusion_matrix(y_test_arg_inc3, y_pred_inc3)
labels = ["0 - No Pneumonia", "1 - Pneumonia"]
disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=labels)
disp.plot()
plt.show()
dense_base_model = tf.keras.applications.DenseNet121(
include_top=False,
weights="imagenet",
input_shape=(224,224,3),
pooling="avg"
)
# don't train existing weights
for layer in dense_base_model.layers:
layer.trainable = False
dense_base_model.summary()
Model: "densenet121"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_4 (InputLayer) [(None, 224, 224, 3 0 []
)]
zero_padding2d (ZeroPadding2D) (None, 230, 230, 3) 0 ['input_4[0][0]']
conv1/conv (Conv2D) (None, 112, 112, 64 9408 ['zero_padding2d[0][0]']
)
conv1/bn (BatchNormalization) (None, 112, 112, 64 256 ['conv1/conv[0][0]']
)
conv1/relu (Activation) (None, 112, 112, 64 0 ['conv1/bn[0][0]']
)
zero_padding2d_1 (ZeroPadding2 (None, 114, 114, 64 0 ['conv1/relu[0][0]']
D) )
pool1 (MaxPooling2D) (None, 56, 56, 64) 0 ['zero_padding2d_1[0][0]']
conv2_block1_0_bn (BatchNormal (None, 56, 56, 64) 256 ['pool1[0][0]']
ization)
conv2_block1_0_relu (Activatio (None, 56, 56, 64) 0 ['conv2_block1_0_bn[0][0]']
n)
conv2_block1_1_conv (Conv2D) (None, 56, 56, 128) 8192 ['conv2_block1_0_relu[0][0]']
conv2_block1_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block1_1_conv[0][0]']
ization)
conv2_block1_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block1_1_bn[0][0]']
n)
conv2_block1_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block1_1_relu[0][0]']
conv2_block1_concat (Concatena (None, 56, 56, 96) 0 ['pool1[0][0]',
te) 'conv2_block1_2_conv[0][0]']
conv2_block2_0_bn (BatchNormal (None, 56, 56, 96) 384 ['conv2_block1_concat[0][0]']
ization)
conv2_block2_0_relu (Activatio (None, 56, 56, 96) 0 ['conv2_block2_0_bn[0][0]']
n)
conv2_block2_1_conv (Conv2D) (None, 56, 56, 128) 12288 ['conv2_block2_0_relu[0][0]']
conv2_block2_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block2_1_conv[0][0]']
ization)
conv2_block2_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block2_1_bn[0][0]']
n)
conv2_block2_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block2_1_relu[0][0]']
conv2_block2_concat (Concatena (None, 56, 56, 128) 0 ['conv2_block1_concat[0][0]',
te) 'conv2_block2_2_conv[0][0]']
conv2_block3_0_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block2_concat[0][0]']
ization)
conv2_block3_0_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block3_0_bn[0][0]']
n)
conv2_block3_1_conv (Conv2D) (None, 56, 56, 128) 16384 ['conv2_block3_0_relu[0][0]']
conv2_block3_1_bn (BatchNormal (None, 56, 56, 128) 512 ['conv2_block3_1_conv[0][0]']
ization)
conv2_block3_1_relu (Activatio (None, 56, 56, 128) 0 ['conv2_block3_1_bn[0][0]']
n)
conv2_block3_2_conv (Conv2D) (None, 56, 56, 32) 36864 ['conv2_block3_1_relu[0][0]']
conv2_block3_concat (Concatena (None, 56, 56, 160) 0 ['conv2_block2_concat[0][0]',
te) 'conv2_block3_2_conv[0][0]']
[model.summary() continues through dense blocks 2-5 of DenseNet-121. Every dense
block repeats the same bottleneck pattern, shown here once with its parameter
counts (C_in = channels entering the block):

 convN_blockM_0_bn     (BatchNormalization)   4 * C_in params
 convN_blockM_0_relu   (Activation)           0 params
 convN_blockM_1_conv   (Conv2D, 1x1, 128)     C_in * 128 params (bottleneck, no bias)
 convN_blockM_1_bn     (BatchNormalization)   512 params
 convN_blockM_1_relu   (Activation)           0 params
 convN_blockM_2_conv   (Conv2D, 3x3, 32)      36864 params (growth rate k = 32)
 convN_blockM_concat   (Concatenate)          0 params; output gains 32 channels

Between dense blocks, a transition layer (pool2/pool3/pool4: BatchNormalization
-> Activation -> 1x1 Conv2D -> AveragePooling2D) halves both the channel count
and the spatial resolution. Stage-by-stage output shapes from the trace:

 Dense block 2 (6 blocks)   : (None, 56, 56, 256)  -> pool2 -> (None, 28, 28, 128)
 Dense block 3 (12 blocks)  : (None, 28, 28, 512)  -> pool3 -> (None, 14, 14, 256)
 Dense block 4 (24 blocks)  : (None, 14, 14, 1024) -> pool4 -> (None, 7, 7, 512)
 Dense block 5 (16 blocks)  : (None, 7, 7, 1024)   (final feature map)]
on)
conv5_block16_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block16_1_relu[0][0]']
conv5_block16_concat (Concaten (None, 7, 7, 1024) 0 ['conv5_block15_concat[0][0]',
ate) 'conv5_block16_2_conv[0][0]']
bn (BatchNormalization)           (None, 7, 7, 1024)  4096   ['conv5_block16_concat[0][0]']
relu (Activation)                 (None, 7, 7, 1024)  0      ['bn[0][0]']
avg_pool (GlobalAveragePooling2D) (None, 1024)        0      ['relu[0][0]']
==================================================================================================
Total params: 7,037,504
Trainable params: 0
Non-trainable params: 7,037,504
__________________________________________________________________________________________________
tf.keras.backend.clear_session()

# Classification head stacked on the frozen DenseNet121 base
model_dense_1 = Sequential()
model_dense_1.add(dense_base_model)
model_dense_1.add(Flatten())          # effectively a no-op: the base already ends in global average pooling
model_dense_1.add(BatchNormalization())
model_dense_1.add(Dense(1024, activation='relu'))
model_dense_1.add(Dropout(0.6))
model_dense_1.add(BatchNormalization())
# Alternative, deeper head (left commented out):
# model_dense_1.add(Dense(128, activation='relu'))
# model_dense_1.add(Dropout(0.5))
# model_dense_1.add(BatchNormalization())
# model_dense_1.add(Dense(64, activation='relu'))
# model_dense_1.add(Dropout(0.3))
model_dense_1.add(Dense(2, activation='sigmoid'))  # two one-hot outputs, scored with binary cross-entropy
model_dense_1.summary()
Model: "sequential"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
densenet121 (Functional) (None, 1024) 7037504
flatten (Flatten) (None, 1024) 0
batch_normalization (BatchNormalization)    (None, 1024)  4096
dense (Dense)                               (None, 1024)  1049600
dropout (Dropout)                           (None, 1024)  0
batch_normalization_1 (BatchNormalization)  (None, 1024)  4096
dense_1 (Dense) (None, 2) 2050
=================================================================
Total params: 8,097,346
Trainable params: 1,055,746
Non-trainable params: 7,041,600
_________________________________________________________________
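The parameter counts in this summary can be verified by hand; a quick arithmetic sanity check (plain Python, with all numbers taken from the summary above):

```python
# Parameter counts for the classification head on top of the 1024-d
# DenseNet121 feature vector.
features = 1024

bn_params = 4 * features               # gamma, beta, moving mean, moving variance
dense_params = features * 1024 + 1024  # weights + biases of Dense(1024)
bn1_params = 4 * 1024
out_params = 1024 * 2 + 2              # weights + biases of Dense(2)

head_params = bn_params + dense_params + bn1_params + out_params
total_params = 7_037_504 + head_params            # frozen DenseNet121 base + head
trainable_params = head_params - 2 * (2 * 1024)   # BN moving statistics are non-trainable

print(head_params)       # 1059842
print(total_params)      # 8097346
print(trainable_params)  # 1055746
```

These reproduce the `Total params: 8,097,346` and `Trainable params: 1,055,746` lines, confirming that only the head is being trained.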
optimizer = Adam(learning_rate=0.0001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
model_dense_1.compile(optimizer=optimizer, loss='binary_crossentropy', metrics=['acc'])
batch_size = 60
nb_epochs = 12
model_dense_hist_1 = model_dense_1.fit(X_train1, y_train1,
                                       batch_size=batch_size,
                                       epochs=nb_epochs,
                                       validation_data=(X_val, y_val),
                                       initial_epoch=0,
                                       callbacks=[EarlyStopping(monitor='val_loss', patience=2, min_delta=0.001)])
Epoch 1/12
64/64 [==============================] - 298s 5s/step - loss: 0.7282 - acc: 0.6596 - val_loss: 0.7735 - val_acc: 0.6104
Epoch 2/12
64/64 [==============================] - 322s 5s/step - loss: 0.6637 - acc: 0.7036 - val_loss: 0.5831 - val_acc: 0.7073
Epoch 3/12
64/64 [==============================] - 343s 5s/step - loss: 0.6419 - acc: 0.7193 - val_loss: 0.5654 - val_acc: 0.7271
Epoch 4/12
64/64 [==============================] - 384s 6s/step - loss: 0.6205 - acc: 0.7271 - val_loss: 0.5502 - val_acc: 0.7406
Epoch 5/12
64/64 [==============================] - 397s 6s/step - loss: 0.6024 - acc: 0.7318 - val_loss: 0.5477 - val_acc: 0.7406
Epoch 6/12
64/64 [==============================] - 311s 5s/step - loss: 0.5921 - acc: 0.7471 - val_loss: 0.5524 - val_acc: 0.7240
Epoch 7/12
64/64 [==============================] - 309s 5s/step - loss: 0.5812 - acc: 0.7346 - val_loss: 0.5460 - val_acc: 0.7281
Epoch 8/12
64/64 [==============================] - 311s 5s/step - loss: 0.5608 - acc: 0.7612 - val_loss: 0.5534 - val_acc: 0.7312
Epoch 9/12
64/64 [==============================] - 308s 5s/step - loss: 0.5663 - acc: 0.7492 - val_loss: 0.5455 - val_acc: 0.7240
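Training halted after epoch 9 of 12 because of the EarlyStopping callback: with `patience=2` and `min_delta=0.001`, val_loss failed to improve by more than 0.001 for two consecutive epochs (8 and 9). A minimal re-implementation of that stopping rule over the logged val_loss values, assuming Keras's improvement test `current < best - min_delta`:

```python
# val_loss for epochs 1-9, copied from the training log above
val_losses = [0.7735, 0.5831, 0.5654, 0.5502, 0.5477,
              0.5524, 0.5460, 0.5534, 0.5455]

def early_stop_epoch(losses, patience=2, min_delta=0.001):
    """Return the 1-based epoch at which Keras-style EarlyStopping halts."""
    best = float("inf")
    wait = 0
    for epoch, loss in enumerate(losses, start=1):
        if loss < best - min_delta:  # improvement larger than min_delta
            best = loss
            wait = 0
        else:
            wait += 1
            if wait >= patience:
                return epoch
    return len(losses)               # stopping never triggered

print(early_stop_epoch(val_losses))  # 9
```

Note that epoch 9's 0.5455 is numerically the lowest value, but it beats the epoch-7 best of 0.5460 by less than `min_delta`, so it does not count as an improvement.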
fig1 = plt.gcf()
plt.plot(model_dense_hist_1.history['acc'])
plt.plot(model_dense_hist_1.history['val_acc'])
plt.ylim(0.4, 1)
plt.grid()
plt.title('Model Accuracy')
plt.ylabel('Accuracy')
plt.xlabel('Epochs')
plt.legend(['train', 'validation'])
plt.show()
fig2 = plt.gcf()
plt.plot(model_dense_hist_1.history['loss'])
plt.plot(model_dense_hist_1.history['val_loss'])
plt.ylim(0.4, 1)
plt.grid()
plt.title('Model Loss')
plt.ylabel('Loss')
plt.xlabel('Epochs')
plt.legend(['train', 'validation'])
plt.show()
loss_traindn1, accuracy_traindn1 = model_dense_1.evaluate(X_train1,y_train1)
loss_valdn1, accuracy_valdn1 = model_dense_1.evaluate(X_val,y_val)
loss_testdn1, accuracy_testdn1 = model_dense_1.evaluate(X_test,y_test)
120/120 [==============================] - 261s 2s/step - loss: 0.4536 - acc: 0.7833
30/30 [==============================] - 69s 2s/step - loss: 0.5455 - acc: 0.7240
38/38 [==============================] - 86s 2s/step - loss: 0.5565 - acc: 0.7308
y_pred_dn1 = model_dense_1.predict(X_test)
38/38 [==============================] - 81s 2s/step
y_test_arg_dn1 = np.argmax(y_test,axis=1)
y_pred_dn1 = np.argmax(y_pred_dn1,axis=1)
print(confusion_matrix(y_test_arg_dn1, y_pred_dn1))
[[434 166]
 [157 443]]
Accuracy  = metrics.accuracy_score(y_test_arg_dn1, y_pred_dn1)
Precision = metrics.precision_score(y_test_arg_dn1, y_pred_dn1)
Recall    = metrics.recall_score(y_test_arg_dn1, y_pred_dn1)
F1_score  = metrics.f1_score(y_test_arg_dn1, y_pred_dn1)
ROC       = metrics.roc_auc_score(y_test_arg_dn1, y_pred_dn1)
dn1_series = pd.Series(["DENSENET121", Accuracy, Precision, Recall, F1_score, ROC])
# DataFrame.append was removed in pandas 2.0; pd.concat is the supported equivalent
Pre_trained_results = pd.concat([Pre_trained_results, dn1_series.to_frame().T], ignore_index=True)
print("DENSENET121 - Accuracy: ",Accuracy)
print("DENSENET121 - Precision: ",Precision)
print("DENSENET121 - Recall: ",Recall)
print("DENSENET121 - F1 score: ",F1_score)
print("DENSENET121 - ROC: ",ROC)
DENSENET121 - Accuracy:  0.7308333333333333
DENSENET121 - Precision:  0.7274220032840722
DENSENET121 - Recall:  0.7383333333333333
DENSENET121 - F1 score:  0.7328370554177005
DENSENET121 - ROC:  0.7308333333333333
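As a cross-check on the `metrics` calls, every printed score can be re-derived by hand from the confusion matrix [[434 166], [157 443]]:

```python
# Confusion matrix from above: rows = true class, columns = predicted class
tn, fp = 434, 166   # true "No Pneumonia" cases
fn, tp = 157, 443   # true "Pneumonia" cases

accuracy  = (tp + tn) / (tp + tn + fp + fn)
precision = tp / (tp + fp)
recall    = tp / (tp + fn)
f1        = 2 * precision * recall / (precision + recall)
# With hard 0/1 predictions, ROC AUC reduces to (TPR + TNR) / 2
roc_auc   = 0.5 * (recall + tn / (tn + fp))

print(round(accuracy, 4), round(precision, 4), round(recall, 4),
      round(f1, 4), round(roc_auc, 4))
# 0.7308 0.7274 0.7383 0.7328 0.7308
```

Because `roc_auc_score` was given thresholded class labels rather than predicted probabilities, the reported ROC value collapses to balanced accuracy, which is why it exactly equals the accuracy here.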
cm = confusion_matrix(y_test_arg_dn1, y_pred_dn1)
labels = ["0 - No Pneumonia", "1 - Pneumonia"]
disp = ConfusionMatrixDisplay(confusion_matrix=cm, display_labels=labels)
disp.plot()
plt.show()
weights_path = "brucechou1983_CheXNet_Keras_0.3.0_weights.h5"

def build_model():
    base_model = tf.keras.applications.DenseNet121(weights=None,
                                                   include_top=False,
                                                   input_shape=(224, 224, 3),
                                                   pooling="avg")
    # freezing initial layers of the base model (left trainable here)
    # for layer in base_model.layers:
    #     layer.trainable = False

    # load the pre-trained CheXNet weights into the matching DenseNet121 layers
    base_model.load_weights(weights_path, by_name=True)

    x = tf.keras.layers.Dense(units=256, activation='relu')(base_model.output)
    x = tf.keras.layers.BatchNormalization()(x)
    x = tf.keras.activations.relu(x)
    x = tf.keras.layers.Dropout(rate=0.1)(x)
    x = tf.keras.layers.Dense(units=128)(x)
    x = tf.keras.layers.BatchNormalization()(x)
    x = tf.keras.activations.relu(x)
    predictions = tf.keras.layers.Dense(2, activation='softmax', name='predictions')(x)

    model = tf.keras.Model(inputs=base_model.input, outputs=predictions)

    learning_rate = 4.8637e-5
    # adamw = tfa.optimizers.AdamW(weight_decay=wd, learning_rate=learning_rate)
    optimizer = tf.keras.optimizers.Adam(learning_rate)
    # the predictions layer already applies softmax, so the loss receives
    # probabilities, not logits
    model.compile(optimizer=optimizer,
                  loss=tf.keras.losses.BinaryCrossentropy(from_logits=False),
                  metrics=[tf.keras.metrics.AUC(),
                           tf.keras.metrics.Precision(),
                           tf.keras.metrics.Recall()])
    return model
new_dense_model = build_model()
new_dense_model.metrics
[]
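A subtlety worth flagging in `build_model`: the `predictions` layer already applies softmax, so `BinaryCrossentropy` should be configured with `from_logits=False`. Passing `from_logits=True` would make the loss run the probabilities through a sigmoid a second time and report a distorted value. A small numeric illustration in plain Python (no TensorFlow needed):

```python
import math

def bce(y, p):
    """Binary cross-entropy for a single target y and probability p."""
    return -(y * math.log(p) + (1 - y) * math.log(1 - p))

def sigmoid(z):
    return 1 / (1 + math.exp(-z))

p = 0.9   # model output after softmax: already a probability
y = 1.0

correct = bce(y, p)            # from_logits=False: p used as-is
wrong   = bce(y, sigmoid(p))   # from_logits=True: p squashed a second time

print(round(correct, 4))  # 0.1054
print(round(wrong, 4))    # 0.3412
```

The distortion is silent: training still runs, but the gradients are computed against the wrong quantity, which can noticeably degrade convergence.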
new_dense_model.summary()
Model: "model"
__________________________________________________________________________________________________
Layer (type) Output Shape Param # Connected to
==================================================================================================
input_1 (InputLayer)             [(None, 224, 224, 3)]  0      []
zero_padding2d (ZeroPadding2D)   (None, 230, 230, 3)    0      ['input_1[0][0]']
conv1/conv (Conv2D)              (None, 112, 112, 64)   9408   ['zero_padding2d[0][0]']
conv1/bn (BatchNormalization)    (None, 112, 112, 64)   256    ['conv1/conv[0][0]']
conv1/relu (Activation)          (None, 112, 112, 64)   0      ['conv1/bn[0][0]']
zero_padding2d_1 (ZeroPadding2D) (None, 114, 114, 64)   0      ['conv1/relu[0][0]']
pool1 (MaxPooling2D)             (None, 56, 56, 64)     0      ['zero_padding2d_1[0][0]']
... (dense blocks conv2_block1 through conv4_block18 repeat the same pattern: BatchNormalization → ReLU → 1x1 Conv2D → BatchNormalization → ReLU → 3x3 Conv2D → Concatenate, each block appending 32 feature maps; transition layers pool2 and pool3 halve both the spatial resolution and the channel count) ...
conv4_block19_0_bn (BatchNorma (None, 14, 14, 832) 3328 ['conv4_block18_concat[0][0]']
lization)
conv4_block19_0_relu (Activati (None, 14, 14, 832) 0 ['conv4_block19_0_bn[0][0]']
on)
conv4_block19_1_conv (Conv2D) (None, 14, 14, 128) 106496 ['conv4_block19_0_relu[0][0]']
conv4_block19_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block19_1_conv[0][0]']
lization)
conv4_block19_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block19_1_bn[0][0]']
on)
conv4_block19_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block19_1_relu[0][0]']
conv4_block19_concat (Concaten (None, 14, 14, 864) 0 ['conv4_block18_concat[0][0]',
ate) 'conv4_block19_2_conv[0][0]']
conv4_block20_0_bn (BatchNorma (None, 14, 14, 864) 3456 ['conv4_block19_concat[0][0]']
lization)
conv4_block20_0_relu (Activati (None, 14, 14, 864) 0 ['conv4_block20_0_bn[0][0]']
on)
conv4_block20_1_conv (Conv2D) (None, 14, 14, 128) 110592 ['conv4_block20_0_relu[0][0]']
conv4_block20_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block20_1_conv[0][0]']
lization)
conv4_block20_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block20_1_bn[0][0]']
on)
conv4_block20_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block20_1_relu[0][0]']
conv4_block20_concat (Concaten (None, 14, 14, 896) 0 ['conv4_block19_concat[0][0]',
ate) 'conv4_block20_2_conv[0][0]']
conv4_block21_0_bn (BatchNorma (None, 14, 14, 896) 3584 ['conv4_block20_concat[0][0]']
lization)
conv4_block21_0_relu (Activati (None, 14, 14, 896) 0 ['conv4_block21_0_bn[0][0]']
on)
conv4_block21_1_conv (Conv2D) (None, 14, 14, 128) 114688 ['conv4_block21_0_relu[0][0]']
conv4_block21_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block21_1_conv[0][0]']
lization)
conv4_block21_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block21_1_bn[0][0]']
on)
conv4_block21_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block21_1_relu[0][0]']
conv4_block21_concat (Concaten (None, 14, 14, 928) 0 ['conv4_block20_concat[0][0]',
ate) 'conv4_block21_2_conv[0][0]']
conv4_block22_0_bn (BatchNorma (None, 14, 14, 928) 3712 ['conv4_block21_concat[0][0]']
lization)
conv4_block22_0_relu (Activati (None, 14, 14, 928) 0 ['conv4_block22_0_bn[0][0]']
on)
conv4_block22_1_conv (Conv2D) (None, 14, 14, 128) 118784 ['conv4_block22_0_relu[0][0]']
conv4_block22_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block22_1_conv[0][0]']
lization)
conv4_block22_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block22_1_bn[0][0]']
on)
conv4_block22_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block22_1_relu[0][0]']
conv4_block22_concat (Concaten (None, 14, 14, 960) 0 ['conv4_block21_concat[0][0]',
ate) 'conv4_block22_2_conv[0][0]']
conv4_block23_0_bn (BatchNorma (None, 14, 14, 960) 3840 ['conv4_block22_concat[0][0]']
lization)
conv4_block23_0_relu (Activati (None, 14, 14, 960) 0 ['conv4_block23_0_bn[0][0]']
on)
conv4_block23_1_conv (Conv2D) (None, 14, 14, 128) 122880 ['conv4_block23_0_relu[0][0]']
conv4_block23_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block23_1_conv[0][0]']
lization)
conv4_block23_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block23_1_bn[0][0]']
on)
conv4_block23_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block23_1_relu[0][0]']
conv4_block23_concat (Concaten (None, 14, 14, 992) 0 ['conv4_block22_concat[0][0]',
ate) 'conv4_block23_2_conv[0][0]']
conv4_block24_0_bn (BatchNorma (None, 14, 14, 992) 3968 ['conv4_block23_concat[0][0]']
lization)
conv4_block24_0_relu (Activati (None, 14, 14, 992) 0 ['conv4_block24_0_bn[0][0]']
on)
conv4_block24_1_conv (Conv2D) (None, 14, 14, 128) 126976 ['conv4_block24_0_relu[0][0]']
conv4_block24_1_bn (BatchNorma (None, 14, 14, 128) 512 ['conv4_block24_1_conv[0][0]']
lization)
conv4_block24_1_relu (Activati (None, 14, 14, 128) 0 ['conv4_block24_1_bn[0][0]']
on)
conv4_block24_2_conv (Conv2D) (None, 14, 14, 32) 36864 ['conv4_block24_1_relu[0][0]']
conv4_block24_concat (Concaten (None, 14, 14, 1024 0 ['conv4_block23_concat[0][0]',
ate) ) 'conv4_block24_2_conv[0][0]']
pool4_bn (BatchNormalization) (None, 14, 14, 1024 4096 ['conv4_block24_concat[0][0]']
)
pool4_relu (Activation) (None, 14, 14, 1024 0 ['pool4_bn[0][0]']
)
pool4_conv (Conv2D) (None, 14, 14, 512) 524288 ['pool4_relu[0][0]']
pool4_pool (AveragePooling2D) (None, 7, 7, 512) 0 ['pool4_conv[0][0]']
conv5_block1_0_bn (BatchNormal (None, 7, 7, 512) 2048 ['pool4_pool[0][0]']
ization)
conv5_block1_0_relu (Activatio (None, 7, 7, 512) 0 ['conv5_block1_0_bn[0][0]']
n)
conv5_block1_1_conv (Conv2D) (None, 7, 7, 128) 65536 ['conv5_block1_0_relu[0][0]']
conv5_block1_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block1_1_conv[0][0]']
ization)
conv5_block1_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block1_1_bn[0][0]']
n)
conv5_block1_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block1_1_relu[0][0]']
conv5_block1_concat (Concatena (None, 7, 7, 544) 0 ['pool4_pool[0][0]',
te) 'conv5_block1_2_conv[0][0]']
conv5_block2_0_bn (BatchNormal (None, 7, 7, 544) 2176 ['conv5_block1_concat[0][0]']
ization)
conv5_block2_0_relu (Activatio (None, 7, 7, 544) 0 ['conv5_block2_0_bn[0][0]']
n)
conv5_block2_1_conv (Conv2D) (None, 7, 7, 128) 69632 ['conv5_block2_0_relu[0][0]']
conv5_block2_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block2_1_conv[0][0]']
ization)
conv5_block2_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block2_1_bn[0][0]']
n)
conv5_block2_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block2_1_relu[0][0]']
conv5_block2_concat (Concatena (None, 7, 7, 576) 0 ['conv5_block1_concat[0][0]',
te) 'conv5_block2_2_conv[0][0]']
conv5_block3_0_bn (BatchNormal (None, 7, 7, 576) 2304 ['conv5_block2_concat[0][0]']
ization)
conv5_block3_0_relu (Activatio (None, 7, 7, 576) 0 ['conv5_block3_0_bn[0][0]']
n)
conv5_block3_1_conv (Conv2D) (None, 7, 7, 128) 73728 ['conv5_block3_0_relu[0][0]']
conv5_block3_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block3_1_conv[0][0]']
ization)
conv5_block3_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block3_1_bn[0][0]']
n)
conv5_block3_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block3_1_relu[0][0]']
conv5_block3_concat (Concatena (None, 7, 7, 608) 0 ['conv5_block2_concat[0][0]',
te) 'conv5_block3_2_conv[0][0]']
conv5_block4_0_bn (BatchNormal (None, 7, 7, 608) 2432 ['conv5_block3_concat[0][0]']
ization)
conv5_block4_0_relu (Activatio (None, 7, 7, 608) 0 ['conv5_block4_0_bn[0][0]']
n)
conv5_block4_1_conv (Conv2D) (None, 7, 7, 128) 77824 ['conv5_block4_0_relu[0][0]']
conv5_block4_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block4_1_conv[0][0]']
ization)
conv5_block4_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block4_1_bn[0][0]']
n)
conv5_block4_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block4_1_relu[0][0]']
conv5_block4_concat (Concatena (None, 7, 7, 640) 0 ['conv5_block3_concat[0][0]',
te) 'conv5_block4_2_conv[0][0]']
conv5_block5_0_bn (BatchNormal (None, 7, 7, 640) 2560 ['conv5_block4_concat[0][0]']
ization)
conv5_block5_0_relu (Activatio (None, 7, 7, 640) 0 ['conv5_block5_0_bn[0][0]']
n)
conv5_block5_1_conv (Conv2D) (None, 7, 7, 128) 81920 ['conv5_block5_0_relu[0][0]']
conv5_block5_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block5_1_conv[0][0]']
ization)
conv5_block5_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block5_1_bn[0][0]']
n)
conv5_block5_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block5_1_relu[0][0]']
conv5_block5_concat (Concatena (None, 7, 7, 672) 0 ['conv5_block4_concat[0][0]',
te) 'conv5_block5_2_conv[0][0]']
conv5_block6_0_bn (BatchNormal (None, 7, 7, 672) 2688 ['conv5_block5_concat[0][0]']
ization)
conv5_block6_0_relu (Activatio (None, 7, 7, 672) 0 ['conv5_block6_0_bn[0][0]']
n)
conv5_block6_1_conv (Conv2D) (None, 7, 7, 128) 86016 ['conv5_block6_0_relu[0][0]']
conv5_block6_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block6_1_conv[0][0]']
ization)
conv5_block6_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block6_1_bn[0][0]']
n)
conv5_block6_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block6_1_relu[0][0]']
conv5_block6_concat (Concatena (None, 7, 7, 704) 0 ['conv5_block5_concat[0][0]',
te) 'conv5_block6_2_conv[0][0]']
conv5_block7_0_bn (BatchNormal (None, 7, 7, 704) 2816 ['conv5_block6_concat[0][0]']
ization)
conv5_block7_0_relu (Activatio (None, 7, 7, 704) 0 ['conv5_block7_0_bn[0][0]']
n)
conv5_block7_1_conv (Conv2D) (None, 7, 7, 128) 90112 ['conv5_block7_0_relu[0][0]']
conv5_block7_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block7_1_conv[0][0]']
ization)
conv5_block7_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block7_1_bn[0][0]']
n)
conv5_block7_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block7_1_relu[0][0]']
conv5_block7_concat (Concatena (None, 7, 7, 736) 0 ['conv5_block6_concat[0][0]',
te) 'conv5_block7_2_conv[0][0]']
conv5_block8_0_bn (BatchNormal (None, 7, 7, 736) 2944 ['conv5_block7_concat[0][0]']
ization)
conv5_block8_0_relu (Activatio (None, 7, 7, 736) 0 ['conv5_block8_0_bn[0][0]']
n)
conv5_block8_1_conv (Conv2D) (None, 7, 7, 128) 94208 ['conv5_block8_0_relu[0][0]']
conv5_block8_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block8_1_conv[0][0]']
ization)
conv5_block8_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block8_1_bn[0][0]']
n)
conv5_block8_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block8_1_relu[0][0]']
conv5_block8_concat (Concatena (None, 7, 7, 768) 0 ['conv5_block7_concat[0][0]',
te) 'conv5_block8_2_conv[0][0]']
conv5_block9_0_bn (BatchNormal (None, 7, 7, 768) 3072 ['conv5_block8_concat[0][0]']
ization)
conv5_block9_0_relu (Activatio (None, 7, 7, 768) 0 ['conv5_block9_0_bn[0][0]']
n)
conv5_block9_1_conv (Conv2D) (None, 7, 7, 128) 98304 ['conv5_block9_0_relu[0][0]']
conv5_block9_1_bn (BatchNormal (None, 7, 7, 128) 512 ['conv5_block9_1_conv[0][0]']
ization)
conv5_block9_1_relu (Activatio (None, 7, 7, 128) 0 ['conv5_block9_1_bn[0][0]']
n)
conv5_block9_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block9_1_relu[0][0]']
conv5_block9_concat (Concatena (None, 7, 7, 800) 0 ['conv5_block8_concat[0][0]',
te) 'conv5_block9_2_conv[0][0]']
conv5_block10_0_bn (BatchNorma (None, 7, 7, 800) 3200 ['conv5_block9_concat[0][0]']
lization)
conv5_block10_0_relu (Activati (None, 7, 7, 800) 0 ['conv5_block10_0_bn[0][0]']
on)
conv5_block10_1_conv (Conv2D) (None, 7, 7, 128) 102400 ['conv5_block10_0_relu[0][0]']
conv5_block10_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block10_1_conv[0][0]']
lization)
conv5_block10_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block10_1_bn[0][0]']
on)
conv5_block10_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block10_1_relu[0][0]']
conv5_block10_concat (Concaten (None, 7, 7, 832) 0 ['conv5_block9_concat[0][0]',
ate) 'conv5_block10_2_conv[0][0]']
conv5_block11_0_bn (BatchNorma (None, 7, 7, 832) 3328 ['conv5_block10_concat[0][0]']
lization)
conv5_block11_0_relu (Activati (None, 7, 7, 832) 0 ['conv5_block11_0_bn[0][0]']
on)
conv5_block11_1_conv (Conv2D) (None, 7, 7, 128) 106496 ['conv5_block11_0_relu[0][0]']
conv5_block11_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block11_1_conv[0][0]']
lization)
conv5_block11_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block11_1_bn[0][0]']
on)
conv5_block11_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block11_1_relu[0][0]']
conv5_block11_concat (Concaten (None, 7, 7, 864) 0 ['conv5_block10_concat[0][0]',
ate) 'conv5_block11_2_conv[0][0]']
conv5_block12_0_bn (BatchNorma (None, 7, 7, 864) 3456 ['conv5_block11_concat[0][0]']
lization)
conv5_block12_0_relu (Activati (None, 7, 7, 864) 0 ['conv5_block12_0_bn[0][0]']
on)
conv5_block12_1_conv (Conv2D) (None, 7, 7, 128) 110592 ['conv5_block12_0_relu[0][0]']
conv5_block12_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block12_1_conv[0][0]']
lization)
conv5_block12_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block12_1_bn[0][0]']
on)
conv5_block12_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block12_1_relu[0][0]']
conv5_block12_concat (Concaten (None, 7, 7, 896) 0 ['conv5_block11_concat[0][0]',
ate) 'conv5_block12_2_conv[0][0]']
conv5_block13_0_bn (BatchNorma (None, 7, 7, 896) 3584 ['conv5_block12_concat[0][0]']
lization)
conv5_block13_0_relu (Activati (None, 7, 7, 896) 0 ['conv5_block13_0_bn[0][0]']
on)
conv5_block13_1_conv (Conv2D) (None, 7, 7, 128) 114688 ['conv5_block13_0_relu[0][0]']
conv5_block13_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block13_1_conv[0][0]']
lization)
conv5_block13_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block13_1_bn[0][0]']
on)
conv5_block13_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block13_1_relu[0][0]']
conv5_block13_concat (Concaten (None, 7, 7, 928) 0 ['conv5_block12_concat[0][0]',
ate) 'conv5_block13_2_conv[0][0]']
conv5_block14_0_bn (BatchNorma (None, 7, 7, 928) 3712 ['conv5_block13_concat[0][0]']
lization)
conv5_block14_0_relu (Activati (None, 7, 7, 928) 0 ['conv5_block14_0_bn[0][0]']
on)
conv5_block14_1_conv (Conv2D) (None, 7, 7, 128) 118784 ['conv5_block14_0_relu[0][0]']
conv5_block14_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block14_1_conv[0][0]']
lization)
conv5_block14_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block14_1_bn[0][0]']
on)
conv5_block14_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block14_1_relu[0][0]']
conv5_block14_concat (Concaten (None, 7, 7, 960) 0 ['conv5_block13_concat[0][0]',
ate) 'conv5_block14_2_conv[0][0]']
conv5_block15_0_bn (BatchNorma (None, 7, 7, 960) 3840 ['conv5_block14_concat[0][0]']
lization)
conv5_block15_0_relu (Activati (None, 7, 7, 960) 0 ['conv5_block15_0_bn[0][0]']
on)
conv5_block15_1_conv (Conv2D) (None, 7, 7, 128) 122880 ['conv5_block15_0_relu[0][0]']
conv5_block15_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block15_1_conv[0][0]']
lization)
conv5_block15_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block15_1_bn[0][0]']
on)
conv5_block15_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block15_1_relu[0][0]']
conv5_block15_concat (Concaten (None, 7, 7, 992) 0 ['conv5_block14_concat[0][0]',
ate) 'conv5_block15_2_conv[0][0]']
conv5_block16_0_bn (BatchNorma (None, 7, 7, 992) 3968 ['conv5_block15_concat[0][0]']
lization)
conv5_block16_0_relu (Activati (None, 7, 7, 992) 0 ['conv5_block16_0_bn[0][0]']
on)
conv5_block16_1_conv (Conv2D) (None, 7, 7, 128) 126976 ['conv5_block16_0_relu[0][0]']
conv5_block16_1_bn (BatchNorma (None, 7, 7, 128) 512 ['conv5_block16_1_conv[0][0]']
lization)
conv5_block16_1_relu (Activati (None, 7, 7, 128) 0 ['conv5_block16_1_bn[0][0]']
on)
conv5_block16_2_conv (Conv2D) (None, 7, 7, 32) 36864 ['conv5_block16_1_relu[0][0]']
conv5_block16_concat (Concaten (None, 7, 7, 1024) 0 ['conv5_block15_concat[0][0]',
ate) 'conv5_block16_2_conv[0][0]']
bn (BatchNormalization) (None, 7, 7, 1024) 4096 ['conv5_block16_concat[0][0]']
relu (Activation) (None, 7, 7, 1024) 0 ['bn[0][0]']
avg_pool (GlobalAveragePooling (None, 1024) 0 ['relu[0][0]']
2D)
dense (Dense) (None, 256) 262400 ['avg_pool[0][0]']
batch_normalization (BatchNorm (None, 256) 1024 ['dense[0][0]']
alization)
tf.nn.relu (TFOpLambda) (None, 256) 0 ['batch_normalization[0][0]']
dropout (Dropout) (None, 256) 0 ['tf.nn.relu[0][0]']
dense_1 (Dense) (None, 128) 32896 ['dropout[0][0]']
batch_normalization_1 (BatchNo (None, 128) 512 ['dense_1[0][0]']
rmalization)
tf.nn.relu_1 (TFOpLambda) (None, 128) 0 ['batch_normalization_1[0][0]']
predictions (Dense) (None, 2) 258 ['tf.nn.relu_1[0][0]']
==================================================================================================
Total params: 7,334,594
Trainable params: 7,250,178
Non-trainable params: 84,416
__________________________________________________________________________________________________
checkpoint = ModelCheckpoint(filepath= 'Densenet_tuned.h5', save_best_only=True, save_weights_only=False)  # keep only the best model seen so far
lr_reduce = ReduceLROnPlateau(monitor='val_loss', factor=0.99, patience=2, verbose=2, mode='min')  # shrink LR by 1% after 2 stagnant epochs
early_stop = EarlyStopping(monitor='val_loss', min_delta=0, patience=2, mode='min')  # stop after 2 epochs without val_loss improvement
optimizer = Adam(learning_rate=0.0001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
tf.debugging.set_log_device_placement(True)
try:
with tf.device('/device:XLA_GPU:0'):
tf.keras.backend.clear_session()
new_dense_model.compile(optimizer=optimizer, loss='binary_crossentropy', metrics=['acc'])
except RuntimeError as e:
print(e)
tf.keras.backend.clear_session()
batch_size = 60
nb_epochs = 10
new_dense_hist_1 = new_dense_model.fit(X_train1, y_train1,
batch_size=batch_size,
epochs=nb_epochs,
validation_data=(X_val, y_val),
initial_epoch=0, callbacks=([checkpoint, lr_reduce, early_stop]))
Epoch 1/12
64/64 [==============================] - 224s 3s/step - loss: 0.6741 - acc: 0.6227 - val_loss: 1.2947 - val_acc: 0.5000 - lr: 1.0000e-04
Epoch 2/12
64/64 [==============================] - 250s 4s/step - loss: 0.6351 - acc: 0.6602 - val_loss: 0.9560 - val_acc: 0.4885 - lr: 1.0000e-04
Epoch 3/12
64/64 [==============================] - 235s 4s/step - loss: 0.6230 - acc: 0.6721 - val_loss: 0.7419 - val_acc: 0.5417 - lr: 1.0000e-04
Epoch 4/12
64/64 [==============================] - 228s 4s/step - loss: 0.6200 - acc: 0.6779 - val_loss: 0.7245 - val_acc: 0.5646 - lr: 1.0000e-04
Epoch 5/12
64/64 [==============================] - 238s 4s/step - loss: 0.6095 - acc: 0.6875 - val_loss: 0.7969 - val_acc: 0.5333 - lr: 1.0000e-04
Epoch 6/12
64/64 [==============================] - ETA: 0s - loss: 0.6072 - acc: 0.6854
Epoch 6: ReduceLROnPlateau reducing learning rate to 9.899999749904965e-05.
64/64 [==============================] - 239s 4s/step - loss: 0.6072 - acc: 0.6854 - val_loss: 0.8147 - val_acc: 0.5344 - lr: 1.0000e-04
pkl.dump(new_dense_model, open('new_dense_model_2.pkl', 'wb'))
# Pickling the DenseNet121 model with CheXNet weights:
# Model metrics change on every run, so we pickled the model from the run with the best accuracy to avoid re-running it.
# Due to resource constraints, we also executed the DenseNet121 model with CheXNet weights in Colab Pro using a 12 GB GPU.
# In Colab Pro, DenseNet121 gave us its best accuracy of more than 80%, which was pickled and used in this project.
load_options = tf.saved_model.LoadOptions(experimental_io_device='/job:localhost')
new_dense_model_2 = pickle.load(open('new_dense_model_2.pkl', 'rb'))
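The rationale for pickling can be illustrated without Keras at all: a pickle round-trip restores serialized objects bit-for-bit, which is why reloading the pickled model reproduces the best run exactly. A minimal sketch, using a hypothetical dict of NumPy arrays as a stand-in for model weights (for Keras models themselves, `model.save()`/`load_model()` is the more robust route):

```python
import pickle
import numpy as np

# Hypothetical stand-in for model weights: a dict of NumPy arrays.
weights = {"dense": np.random.rand(4, 2), "bias": np.zeros(2)}

blob = pickle.dumps(weights)    # serialize to bytes (same as pkl.dump to a file)
restored = pickle.loads(blob)   # deserialize; arrays come back bit-for-bit

# The restored arrays are exactly equal to the originals
assert all(np.array_equal(weights[k], restored[k]) for k in weights)
```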
#new_dense_model_1.predict(X_test1)
Pre_trained_results.columns = ["Model_Name", "Accuracy", "Precision", "Recall", "F1 Score", "ROC"]
Pre_trained_results.style.set_properties(**{'border': '1.3px solid black','color': 'blue'}).hide_index()
| Model_Name | Accuracy | Precision | Recall | F1 Score | ROC |
|---|---|---|---|---|---|
| VGG 16 | 0.712500 | 0.704655 | 0.731667 | 0.717907 | 0.712500 |
| VGG 16 Using IDG | 0.663333 | 0.741379 | 0.501667 | 0.598410 | 0.663333 |
| RESNET50 | 0.718333 | 0.707937 | 0.743333 | 0.725203 | 0.718333 |
| RESNET50_IMAGE DATA GENERATOR | 0.715000 | 0.723958 | 0.695000 | 0.709184 | 0.715000 |
| INCEPTION V3 | 0.636667 | 0.652985 | 0.583333 | 0.616197 | 0.636667 |
| INCEPTION V3_IMAGE DATA GENERATOR | 0.585833 | 0.554040 | 0.880000 | 0.679974 | 0.585833 |
| INCEPTION V3 - ADDED LAYERS | 0.688333 | 0.687708 | 0.690000 | 0.688852 | 0.688333 |
| DENSENET121 | 0.730833 | 0.727422 | 0.738333 | 0.732837 | 0.730833 |
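The table's columns follow directly from the confusion-matrix counts of each model's binary predictions. A minimal sketch with toy arrays (not the project's actual predictions) shows the arithmetic:

```python
import numpy as np

y_true = np.array([1, 1, 0, 0, 1, 0])  # toy ground-truth labels
y_pred = np.array([1, 0, 1, 0, 1, 0])  # toy model predictions

# Confusion-matrix counts
tp = np.sum((y_true == 1) & (y_pred == 1))
fp = np.sum((y_true == 0) & (y_pred == 1))
fn = np.sum((y_true == 1) & (y_pred == 0))
tn = np.sum((y_true == 0) & (y_pred == 0))

# The metrics reported in the table
accuracy  = (tp + tn) / len(y_true)
precision = tp / (tp + fp)
recall    = tp / (tp + fn)
f1        = 2 * precision * recall / (precision + recall)
```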
train_label_pneumonia = train_label[train_label["Target"] == 1]
train_label_pneumonia = train_label_pneumonia.reset_index(drop = True)
train_label_pneumonia["Full_filename"] = "stage_2_train_images"+"/"+train_label_pneumonia["patientId"]+".dcm"
train_label_pneumonia["Filename"] = train_label_pneumonia["patientId"]+".dcm"
#train_label_pneumonia = train_label_pneumonia.drop(columns = ["patientId"])
train_4500_rows = train_label_pneumonia[:4500]
train_4500_rows.sample(2)
| | patientId | x | y | width | height | Target | Full_filename | Filename |
|---|---|---|---|---|---|---|---|---|
| 223 | 081a1f56-6c47-4b80-988a-3d357c977d65 | 559.0 | 393.0 | 289.0 | 502.0 | 1 | stage_2_train_images/081a1f56-6c47-4b80-988a-3d357c977d65.dcm | 081a1f56-6c47-4b80-988a-3d357c977d65.dcm |
| 544 | 11750ff6-94ea-43d0-bad8-f3c3e5278d7d | 570.0 | 395.0 | 271.0 | 499.0 | 1 | stage_2_train_images/11750ff6-94ea-43d0-bad8-f3c3e5278d7d.dcm | 11750ff6-94ea-43d0-bad8-f3c3e5278d7d.dcm |
train_y_4500_box = train_4500_rows.drop(columns = ["Full_filename","Filename","Target","patientId"])
train_y_4500_box.columns
Index(['x', 'y', 'width', 'height'], dtype='object')
y_4500 = train_y_4500_box.copy()
img_rows=224
img_cols=224
dim = (img_rows, img_cols)
X_opac = []
brk = 0   # count of images that failed to resize
for img in tqdm(train_4500_rows["Full_filename"].values):
    ds_3 = dicom.dcmread(img)                           # read the DICOM file
    img_3 = ds_3.pixel_array                            # raw pixel array
    train_img = apply_color_lut(img_3, palette='PET')   # map grayscale to an RGB palette
    try:
        train_img_resize = cv2.resize(train_img, dim, interpolation=cv2.INTER_LINEAR)
    except cv2.error:
        brk += 1
        print("breaking out for", img)
        break
    X_opac.append(train_img_resize)
100%|██████████████████████████████████████████████████████████████████████████████| 4500/4500 [01:50<00:00, 40.68it/s]
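The resize step above shrinks each 1024×1024 slice to the 224×224 input the network expects. A simplified nearest-neighbour resize in plain NumPy (a stand-in for `cv2.resize`, which uses bilinear interpolation here) shows what that operation does:

```python
import numpy as np

def resize_nearest(img, out_h, out_w):
    """Nearest-neighbour resize in plain NumPy -- a simplified stand-in
    for cv2.resize; cv2's INTER_LINEAR additionally interpolates."""
    in_h, in_w = img.shape[:2]
    rows = np.arange(out_h) * in_h // out_h   # source row for each output row
    cols = np.arange(out_w) * in_w // out_w   # source column for each output column
    return img[rows][:, cols]

# A 1024x1024 RGB slice shrinks to the 224x224 network input
slice_1024 = np.zeros((1024, 1024, 3), dtype=np.uint8)
resized = resize_nearest(slice_1024, 224, 224)
```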
X_opac = np.asarray(X_opac)
fileName = "X_4500.pkl"
fileObject = open(fileName, 'wb')
pkl.dump(X_opac, fileObject)
fileObject.close()
y_4500.to_csv("y_4500.csv", index = False)
train_X_4500_3 = open("X_4500.pkl", "rb")
X_3_45k = pkl.load(train_X_4500_3)
X_3_45k[0:2]
[array([[[ 0, 4, 3],
[ 0, 6, 5],
[ 0, 5, 4],
...,
[ 0, 5, 4],
[ 0, 5, 4],
[ 0, 5, 4]],
[[ 0, 5, 4],
[ 0, 6, 5],
[ 0, 6, 5],
...,
[ 0, 6, 5],
[ 0, 6, 5],
[ 0, 6, 5]],
[[ 0, 5, 4],
[ 0, 6, 5],
[ 0, 6, 5],
...,
[ 0, 6, 5],
[ 0, 6, 5],
[ 0, 6, 5]],
...,
[[ 0, 5, 4],
[ 0, 8, 7],
[ 0, 10, 9],
...,
[ 0, 12, 11],
[ 0, 12, 11],
[ 0, 12, 11]],
[[ 0, 9, 8],
[ 0, 9, 9],
[ 0, 10, 9],
...,
[ 0, 12, 11],
[ 0, 12, 11],
[ 0, 12, 11]],
[[ 0, 7, 6],
[ 0, 8, 7],
[ 0, 11, 10],
...,
[ 0, 11, 10],
[ 0, 13, 12],
[ 0, 13, 12]]], dtype=uint8),
array([[[ 0, 4, 3],
[ 0, 6, 5],
[ 0, 5, 4],
...,
[ 0, 5, 4],
[ 0, 5, 4],
[ 0, 5, 4]],
[[ 0, 5, 4],
[ 0, 6, 5],
[ 0, 6, 5],
...,
[ 0, 6, 5],
[ 0, 6, 5],
[ 0, 6, 5]],
[[ 0, 5, 4],
[ 0, 6, 5],
[ 0, 6, 5],
...,
[ 0, 6, 5],
[ 0, 6, 5],
[ 0, 6, 5]],
...,
[[ 0, 5, 4],
[ 0, 8, 7],
[ 0, 10, 9],
...,
[ 0, 12, 11],
[ 0, 12, 11],
[ 0, 12, 11]],
[[ 0, 9, 8],
[ 0, 9, 9],
[ 0, 10, 9],
...,
[ 0, 12, 11],
[ 0, 12, 11],
[ 0, 12, 11]],
[[ 0, 7, 6],
[ 0, 8, 7],
[ 0, 11, 10],
...,
[ 0, 11, 10],
[ 0, 13, 12],
[ 0, 13, 12]]], dtype=uint8)]
X_3_45k = np.array(X_3_45k)
X_3_45k.shape
(4500, 224, 224, 3)
y_train = train_y_4500_box.to_numpy()
y_train.shape
(4500, 4)
train_4500_rows.iloc[234,:]
patientId        084d42b9-0a43-490d-9a8d-e4d7545c44f5
x                513.0
y                472.0
width            320.0
height           342.0
Target           1
Full_filename    stage_2_train_images/084d42b9-0a43-490d-9a8d-e4d7545c44f5.dcm
Filename         084d42b9-0a43-490d-9a8d-e4d7545c44f5.dcm
Name: 234, dtype: object
X_train_bb = X_3_45k.copy()
y_train_bb = y_train.copy()
print(X_train_bb.shape)
print(y_train_bb.shape)
(4500, 224, 224, 3)
(4500, 4)
rand_image = 234
# Pick an image to check how the bounding box looks after rescaling
orig_size = 1024   # side length of the original DICOM images
new_size = 224     # side length after resizing
image = X_train_bb[rand_image]
fig, ax = plt.subplots(1, figsize=(6, 6))
dim = (new_size, new_size)
img_resize = cv2.resize(image, dim, interpolation=cv2.INTER_CUBIC)
region = y_train_bb[rand_image]   # [x, y, width, height] on the original 1024-pixel grid
image_height, image_width, _ = image.shape
print("region", region)
print(image_height)
print(image_width)
# Scale the box from the 1024-pixel grid down to the 224-pixel grid
x0 = region[0] * new_size / orig_size
y0 = region[1] * new_size / orig_size
box_width = region[2] * new_size / orig_size
box_height = region[3] * new_size / orig_size
# Create a Rectangle patch: (x0, y0) is the top-left corner, followed by width and height
rect = patches.Rectangle((x0, y0), box_width, box_height, linewidth=2, edgecolor='r', facecolor='none')
print(rect)
# Add the patch to the Axes and show the image
ax.imshow(img_resize, cmap=plt.cm.bone)
ax.add_patch(rect)
plt.show()
plt.figure().clear()
region [513. 472. 320. 342.]
224
224
Rectangle(xy=(112.219, 103.25), width=70, height=74.8125, angle=0)
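The printed Rectangle can be checked by hand: the scale factor from the 1024-pixel grid to the 224-pixel grid is 224/1024 = 0.21875, applied uniformly to all four box values.

```python
# Rescaling the sample box [x, y, width, height] = [513, 472, 320, 342]
# from the 1024-pixel DICOM grid to the 224-pixel network grid
orig_size, new_size = 1024, 224
scale = new_size / orig_size   # 0.21875

box = [513.0, 472.0, 320.0, 342.0]
x0, y0, w, h = [v * scale for v in box]
# x0=112.21875, y0=103.25, w=70.0, h=74.8125 -- matching the printed Rectangle
```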
print("X_train_bb",X_train_bb.shape)
print("y_train_bb",y_train_bb.shape)
X_train_bb (4500, 224, 224, 3)
y_train_bb (4500, 4)
X_train, X_test, y_train, y_test = train_test_split(X_train_bb, y_train_bb, test_size=.20, random_state=10) # 80% Training and 20% Testing
print("X_train shape",X_train.shape)
print("y train shape",y_train.shape)
print("x_test shape",X_test.shape)
print("y_test shape",y_test.shape)
X_train shape (3600, 224, 224, 3)
y train shape (3600, 4)
x_test shape (900, 224, 224, 3)
y_test shape (900, 4)
X_train1, X_val, y_train1, y_val = train_test_split(X_train, y_train, test_size=.20, random_state=20) # 80% Training and 20% Testing
print("X_train1 shape",X_train1.shape)
print("y train1 shape",y_train1.shape)
print("X_val shape",X_val.shape)
print("y_val shape",y_val.shape)
X_train1 shape (2880, 224, 224, 3)
y train1 shape (2880, 4)
X_val shape (720, 224, 224, 3)
y_val shape (720, 4)
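Applying 80/20 twice gives an overall 64/16/20 train/validation/test split of the 4,500 images. The arithmetic behind the printed shapes can be sketched as:

```python
# Two successive 80/20 splits of the 4,500 bounding-box images,
# mirroring the two train_test_split calls above
n = 4500
n_test = int(n * 0.20)        # 900 held-out test images
n_train = n - n_test          # 3600 remaining for training
n_val = int(n_train * 0.20)   # 720 validation images
n_train1 = n_train - n_val    # 2880 final training images
```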
ALPHA = 1.0 # Width hyperparameter for MobileNet (0.25, 0.5, 0.75, 1.0). A higher width is more accurate but slower
from tensorflow.keras.layers import Conv2D, Reshape
def create_model_2(trainable=True):
model = MobileNet(input_shape=(128, 128, 3), include_top=False, alpha=ALPHA) # Load pre-trained mobilenet
# include_top=False drops the classification (top) layer
# Freeze (or unfreeze) the base layers; the new top layer added below is always trainable
for layer in model.layers:
layer.trainable = trainable
# Add a new top conv layer whose kernel matches the final feature map, so it outputs only the 4 BBox coordinates
x0 = model.layers[-1].output
# x1 = Conv2D(4, kernel_size=4, kernel_initializer='normal', kernel_regularizer=tf.keras.regularizers.l2(0.01), activity_regularizer=tf.keras.regularizers.l1(0.001), activation="relu", name="coords")(x0)
x1 = Conv2D(4, kernel_size=4, kernel_initializer='normal', kernel_regularizer=tf.keras.regularizers.l2(0.0015), activity_regularizer=tf.keras.regularizers.l1(0.0025), activation="relu", name="coords")(x0)
# x1 = Conv2D(4, kernel_size=4, name="coords")(x0)
# x1 = Conv2D(4, kernel_size=4, kernel_initializer='he_normal', activation="relu", name="coords1")(x1)
# x1 = Conv2D(4, kernel_size=4, kernel_initializer='he_normal', activation="relu", name="coords2")(x1)
# In the line above kernel size should be 3 for img size 96, 4 for img size 128, 5 for img size 160 etc.
x2 = Reshape((4,))(x1) # These are the 4 predicted coordinates of one BBox
return Model(inputs=model.input, outputs=x2)
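The kernel-size rule in the comments (3 for 96-pixel inputs, 4 for 128, 5 for 160) follows from MobileNet reducing spatial resolution by a total factor of 32, so the final feature map's side length is `input_side / 32`, and a kernel of that size collapses it to 1×1:

```python
# MobileNet downsamples by a total stride of 32, so the side length of the
# final feature map -- and hence the head's kernel size -- is input_side // 32.
def head_kernel_size(input_side, total_stride=32):
    return input_side // total_stride

# For the 128-pixel input used above, the 4x4 kernel turns the
# 4x4x1024 feature map into a 1x1x4 output: the four box coordinates.
```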
def IOU_1(y_true, y_pred):
intersections = 0
unions = 0
# Work directly on the ground-truth (gt) and predicted (pred) boxes, each row in [x, y, width, height] form
gt = y_true
pred = y_pred
# Compute intersection of predicted (pred) and ground truth (gt) bounding boxes
diff_width = np.minimum(gt[:,0] + gt[:,2], pred[:,0] + pred[:,2]) - np.maximum(gt[:,0], pred[:,0])
diff_height = np.minimum(gt[:,1] + gt[:,3], pred[:,1] + pred[:,3]) - np.maximum(gt[:,1], pred[:,1])
intersection = diff_width * diff_height
# Compute union
area_gt = gt[:,2] * gt[:,3]
area_pred = pred[:,2] * pred[:,3]
union = area_gt + area_pred - intersection
# Compute intersection and union over multiple boxes
for j, _ in enumerate(union):
if union[j] > 0 and intersection[j] > 0 and union[j] >= intersection[j]:
intersections += intersection[j]
unions += union[j]
# Compute IOU. Use epsilon to prevent division by zero
iou = np.round(intersections / (unions + tf.keras.backend.epsilon()), 4)
# This must match the type used in py_func
iou = iou.astype(np.float32)
return iou
def IoU_2(y_true, y_pred):
iou = tf.py_function(IOU_1, [y_true, y_pred], Tout=tf.float32)
return iou
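A worked example makes the IoU arithmetic in `IOU_1` concrete. For two unit-overlap boxes in `[x, y, width, height]` form, the same intersection/union formulas give IoU = 1/7:

```python
import numpy as np

# One ground-truth and one predicted box in [x, y, width, height] form
gt   = np.array([[0.0, 0.0, 2.0, 2.0]])
pred = np.array([[1.0, 1.0, 2.0, 2.0]])

# Same intersection arithmetic as IOU_1 above
diff_w = np.minimum(gt[:, 0] + gt[:, 2], pred[:, 0] + pred[:, 2]) - np.maximum(gt[:, 0], pred[:, 0])
diff_h = np.minimum(gt[:, 1] + gt[:, 3], pred[:, 1] + pred[:, 3]) - np.maximum(gt[:, 1], pred[:, 1])
inter = diff_w * diff_h                                          # 1 x 1 overlap = 1
union = gt[:, 2] * gt[:, 3] + pred[:, 2] * pred[:, 3] - inter    # 4 + 4 - 1 = 7
iou = float(inter[0] / union[0])                                 # 1/7
```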
base_model1 = create_model_2(False)
base_model1.summary()
Model: "model"
_________________________________________________________________
Layer (type) Output Shape Param #
=================================================================
input_1 (InputLayer) [(None, 128, 128, 3)] 0
conv1 (Conv2D) (None, 64, 64, 32) 864
conv1_bn (BatchNormalizatio (None, 64, 64, 32) 128
n)
conv1_relu (ReLU) (None, 64, 64, 32) 0
conv_dw_1 (DepthwiseConv2D) (None, 64, 64, 32) 288
conv_dw_1_bn (BatchNormaliz (None, 64, 64, 32) 128
ation)
conv_dw_1_relu (ReLU) (None, 64, 64, 32) 0
conv_pw_1 (Conv2D) (None, 64, 64, 64) 2048
conv_pw_1_bn (BatchNormaliz (None, 64, 64, 64) 256
ation)
conv_pw_1_relu (ReLU) (None, 64, 64, 64) 0
conv_pad_2 (ZeroPadding2D) (None, 65, 65, 64) 0
conv_dw_2 (DepthwiseConv2D) (None, 32, 32, 64) 576
conv_dw_2_bn (BatchNormalization) (None, 32, 32, 64) 256
conv_dw_2_relu (ReLU) (None, 32, 32, 64) 0
conv_pw_2 (Conv2D) (None, 32, 32, 128) 8192
conv_pw_2_bn (BatchNormalization) (None, 32, 32, 128) 512
conv_pw_2_relu (ReLU) (None, 32, 32, 128) 0
conv_dw_3 (DepthwiseConv2D) (None, 32, 32, 128) 1152
conv_dw_3_bn (BatchNormalization) (None, 32, 32, 128) 512
conv_dw_3_relu (ReLU) (None, 32, 32, 128) 0
conv_pw_3 (Conv2D) (None, 32, 32, 128) 16384
conv_pw_3_bn (BatchNormalization) (None, 32, 32, 128) 512
conv_pw_3_relu (ReLU) (None, 32, 32, 128) 0
conv_pad_4 (ZeroPadding2D) (None, 33, 33, 128) 0
conv_dw_4 (DepthwiseConv2D) (None, 16, 16, 128) 1152
conv_dw_4_bn (BatchNormalization) (None, 16, 16, 128) 512
conv_dw_4_relu (ReLU) (None, 16, 16, 128) 0
conv_pw_4 (Conv2D) (None, 16, 16, 256) 32768
conv_pw_4_bn (BatchNormalization) (None, 16, 16, 256) 1024
conv_pw_4_relu (ReLU) (None, 16, 16, 256) 0
conv_dw_5 (DepthwiseConv2D) (None, 16, 16, 256) 2304
conv_dw_5_bn (BatchNormalization) (None, 16, 16, 256) 1024
conv_dw_5_relu (ReLU) (None, 16, 16, 256) 0
conv_pw_5 (Conv2D) (None, 16, 16, 256) 65536
conv_pw_5_bn (BatchNormalization) (None, 16, 16, 256) 1024
conv_pw_5_relu (ReLU) (None, 16, 16, 256) 0
conv_pad_6 (ZeroPadding2D) (None, 17, 17, 256) 0
conv_dw_6 (DepthwiseConv2D) (None, 8, 8, 256) 2304
conv_dw_6_bn (BatchNormalization) (None, 8, 8, 256) 1024
conv_dw_6_relu (ReLU) (None, 8, 8, 256) 0
conv_pw_6 (Conv2D) (None, 8, 8, 512) 131072
conv_pw_6_bn (BatchNormalization) (None, 8, 8, 512) 2048
conv_pw_6_relu (ReLU) (None, 8, 8, 512) 0
conv_dw_7 (DepthwiseConv2D) (None, 8, 8, 512) 4608
conv_dw_7_bn (BatchNormalization) (None, 8, 8, 512) 2048
conv_dw_7_relu (ReLU) (None, 8, 8, 512) 0
conv_pw_7 (Conv2D) (None, 8, 8, 512) 262144
conv_pw_7_bn (BatchNormalization) (None, 8, 8, 512) 2048
conv_pw_7_relu (ReLU) (None, 8, 8, 512) 0
conv_dw_8 (DepthwiseConv2D) (None, 8, 8, 512) 4608
conv_dw_8_bn (BatchNormalization) (None, 8, 8, 512) 2048
conv_dw_8_relu (ReLU) (None, 8, 8, 512) 0
conv_pw_8 (Conv2D) (None, 8, 8, 512) 262144
conv_pw_8_bn (BatchNormalization) (None, 8, 8, 512) 2048
conv_pw_8_relu (ReLU) (None, 8, 8, 512) 0
conv_dw_9 (DepthwiseConv2D) (None, 8, 8, 512) 4608
conv_dw_9_bn (BatchNormalization) (None, 8, 8, 512) 2048
conv_dw_9_relu (ReLU) (None, 8, 8, 512) 0
conv_pw_9 (Conv2D) (None, 8, 8, 512) 262144
conv_pw_9_bn (BatchNormalization) (None, 8, 8, 512) 2048
conv_pw_9_relu (ReLU) (None, 8, 8, 512) 0
conv_dw_10 (DepthwiseConv2D) (None, 8, 8, 512) 4608
conv_dw_10_bn (BatchNormalization) (None, 8, 8, 512) 2048
conv_dw_10_relu (ReLU) (None, 8, 8, 512) 0
conv_pw_10 (Conv2D) (None, 8, 8, 512) 262144
conv_pw_10_bn (BatchNormalization) (None, 8, 8, 512) 2048
conv_pw_10_relu (ReLU) (None, 8, 8, 512) 0
conv_dw_11 (DepthwiseConv2D) (None, 8, 8, 512) 4608
conv_dw_11_bn (BatchNormalization) (None, 8, 8, 512) 2048
conv_dw_11_relu (ReLU) (None, 8, 8, 512) 0
conv_pw_11 (Conv2D) (None, 8, 8, 512) 262144
conv_pw_11_bn (BatchNormalization) (None, 8, 8, 512) 2048
conv_pw_11_relu (ReLU) (None, 8, 8, 512) 0
conv_pad_12 (ZeroPadding2D) (None, 9, 9, 512) 0
conv_dw_12 (DepthwiseConv2D) (None, 4, 4, 512) 4608
conv_dw_12_bn (BatchNormalization) (None, 4, 4, 512) 2048
conv_dw_12_relu (ReLU) (None, 4, 4, 512) 0
conv_pw_12 (Conv2D) (None, 4, 4, 1024) 524288
conv_pw_12_bn (BatchNormalization) (None, 4, 4, 1024) 4096
conv_pw_12_relu (ReLU) (None, 4, 4, 1024) 0
conv_dw_13 (DepthwiseConv2D) (None, 4, 4, 1024) 9216
conv_dw_13_bn (BatchNormalization) (None, 4, 4, 1024) 4096
conv_dw_13_relu (ReLU) (None, 4, 4, 1024) 0
conv_pw_13 (Conv2D) (None, 4, 4, 1024) 1048576
conv_pw_13_bn (BatchNormalization) (None, 4, 4, 1024) 4096
conv_pw_13_relu (ReLU) (None, 4, 4, 1024) 0
coords (Conv2D) (None, 1, 1, 4) 65540
reshape (Reshape) (None, 4) 0
=================================================================
Total params: 3,294,404
Trainable params: 65,540
Non-trainable params: 3,228,864
_________________________________________________________________
X_train1.shape[0]
2880
X_train2 = np.resize(X_train1,(X_train1.shape[0],128,128,3))
X_train1.shape, X_train2.shape
((2880, 224, 224, 3), (2880, 128, 128, 3))
X_val.shape
(720, 224, 224, 3)
X_val2 = np.resize(X_val,(X_val.shape[0],128,128,3))
X_val.shape,X_val2.shape
((720, 224, 224, 3), (720, 128, 128, 3))
X_test2 = np.resize(X_test,(X_test.shape[0],128,128,3))
X_test.shape, X_test2.shape
((900, 224, 224, 3), (900, 128, 128, 3))
print(X_train2.shape)
print(X_val2.shape)
print(X_test2.shape)
(2880, 128, 128, 3)
(720, 128, 128, 3)
(900, 128, 128, 3)
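One caveat worth flagging: `np.resize` repeats or truncates the flattened buffer to reach the target shape; it does not interpolate pixels the way an image resize does. A minimal sketch of interpolated downscaling with `tf.image.resize`, using a dummy batch since the real arrays are built earlier in the notebook:

```python
import numpy as np
import tensorflow as tf

# Dummy stand-in for X_train1: a small batch of 224x224 RGB images.
batch = np.zeros((4, 224, 224, 3), dtype=np.float32)

# tf.image.resize interpolates (bilinear by default), unlike np.resize,
# which simply repeats/truncates the underlying buffer.
resized = tf.image.resize(batch, (128, 128)).numpy()
print(resized.shape)  # (4, 128, 128, 3)
```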
y_train1[0:5]
array([[310., 607., 239., 259.],
[568., 481., 194., 173.],
[273., 209., 194., 393.],
[217., 526., 176., 125.],
[558., 280., 243., 482.]])
y_train2 = (y_train1 * 128) / 1024
y_train2[0:5]
array([[38.75 , 75.875, 29.875, 32.375],
[71. , 60.125, 24.25 , 21.625],
[34.125, 26.125, 24.25 , 49.125],
[27.125, 65.75 , 22. , 15.625],
[69.75 , 35. , 30.375, 60.25 ]])
y_val2 = (y_val * 128 )/ 1024
y_val2[0:5]
array([[19.25 , 34.875, 41.75 , 50.375],
[42.125, 58.125, 23.5 , 23.5 ],
[78.25 , 24.625, 43. , 83. ],
[75.375, 35.25 , 27.375, 31.625],
[91.25 , 70. , 13. , 28.875]])
y_test2 = (y_test * 128 )/ 1024
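The target scaling above follows directly from the geometry: boxes are annotated in the original 1024x1024 DICOM pixel space, and the model sees 128x128 inputs, so x, y, width, and height all shrink by the same factor 128/1024 = 1/8. A quick check against the first training row:

```python
import numpy as np

# First ground-truth box from y_train1, in 1024x1024 pixel space.
boxes_1024 = np.array([[310., 607., 239., 259.]])

# Scale every coordinate by 128/1024 to match the 128x128 model input.
boxes_128 = boxes_1024 * 128 / 1024
print(boxes_128)  # [[38.75  75.875 29.875 32.375]]
```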
# Compile the model
opt = Adam(learning_rate=0.00001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
base_model1.compile(loss="mean_squared_error", optimizer=opt, metrics=[IoU_2]) # Regression loss is MSE
# Use earlystopping
callback2 = tf.keras.callbacks.EarlyStopping(monitor='val_IoU_2', patience=3, min_delta=0.01)
# Fit the model
base_model1.fit(X_train2, y_train2, validation_data=(X_val2, y_val2), epochs=30, batch_size=20, callbacks=[callback2])
Epoch 1/30
144/144 [==============================] - 20s 127ms/step - loss: 1828.7288 - IoU_2: 0.3005 - val_loss: 1778.9187 - val_IoU_2: 0.2710
Epoch 2/30
144/144 [==============================] - 18s 127ms/step - loss: 1507.3427 - IoU_2: 0.2646 - val_loss: 1301.6140 - val_IoU_2: 0.2105
Epoch 3/30
144/144 [==============================] - 18s 125ms/step - loss: 1015.9293 - IoU_2: 0.1621 - val_loss: 890.5369 - val_IoU_2: 0.1570
Epoch 4/30
144/144 [==============================] - 18s 127ms/step - loss: 729.0668 - IoU_2: 0.1625 - val_loss: 679.5276 - val_IoU_2: 0.1547
Epoch 5/30
144/144 [==============================] - 18s 128ms/step - loss: 575.9411 - IoU_2: 0.1702 - val_loss: 559.8184 - val_IoU_2: 0.1565
Epoch 6/30
144/144 [==============================] - 19s 131ms/step - loss: 491.3201 - IoU_2: 0.1630 - val_loss: 492.2207 - val_IoU_2: 0.1503
<keras.callbacks.History at 0x27ca0111730>
base_model1.evaluate(X_test2, y_test2)
29/29 [==============================] - 5s 164ms/step - loss: 461.3999 - IoU_2: 0.1692
[461.3999328613281, 0.16920344531536102]
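`IoU_2`, the metric compiled and monitored above, is defined earlier in the notebook. For reference, intersection-over-union for `(x, y, width, height)` boxes can be sketched in plain Python; the function name here is illustrative, not the notebook's exact implementation:

```python
def iou_xywh(a, b):
    """IoU of two boxes given as (x, y, width, height)."""
    # Convert to corner coordinates.
    ax0, ay0, ax1, ay1 = a[0], a[1], a[0] + a[2], a[1] + a[3]
    bx0, by0, bx1, by1 = b[0], b[1], b[0] + b[2], b[1] + b[3]
    # Overlap in each axis (zero when the boxes are disjoint).
    iw = max(0.0, min(ax1, bx1) - max(ax0, bx0))
    ih = max(0.0, min(ay1, by1) - max(ay0, by0))
    inter = iw * ih
    union = a[2] * a[3] + b[2] * b[3] - inter
    return inter / union if union > 0 else 0.0

print(iou_xywh([0, 0, 10, 10], [5, 5, 10, 10]))  # 25 / 175, about 0.1429
```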
pkl.dump(base_model1, open('mobilenet_objdet', 'wb'))
WARNING:absl:Found untraced functions such as _jit_compiled_convolution_op, _jit_compiled_convolution_op, _jit_compiled_convolution_op, _jit_compiled_convolution_op, _jit_compiled_convolution_op while saving (showing 5 of 28). These functions will not be directly callable after loading.
INFO:tensorflow:Assets written to: ram://665fd77b-c745-41cd-8d15-93c098395e0a/assets
INFO:tensorflow:Assets written to: ram://665fd77b-c745-41cd-8d15-93c098395e0a/assets
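The `absl` warning above is a known side effect of pickling a Keras model. The Keras-native route is `model.save` / `load_model`, which avoids untraced-function issues on reload; a tiny sketch with a stand-in model (the notebook's actual model is `base_model1`, built earlier):

```python
import tensorflow as tf

# Stand-in model; the real model in this notebook is base_model1.
inp = tf.keras.Input(shape=(4,))
out = tf.keras.layers.Dense(1)(inp)
model = tf.keras.Model(inp, out)

# Save/restore with the Keras-native API instead of pickle.
model.save("objdet_model.h5")
restored = tf.keras.models.load_model("objdet_model.h5")
print(restored.count_params())  # 5 (4 kernel weights + 1 bias)
```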
X_train4 = X_train2/255.0
X_val4 = X_val2/255.0
X_test4 = X_test2/255.0
tf.keras.backend.clear_session()
# Use earlystopping
callback2 = tf.keras.callbacks.EarlyStopping(monitor='val_IoU_2', patience=3, min_delta=0.01)
# Fit the model
base_model1.fit(X_train4, y_train2, validation_data=(X_val4, y_val2), epochs=30, batch_size=20, callbacks=[callback2])
Epoch 1/30
144/144 [==============================] - 20s 130ms/step - loss: 431.2441 - IoU_2: 0.1555 - val_loss: 431.0265 - val_IoU_2: 0.1440
Epoch 2/30
144/144 [==============================] - 16s 112ms/step - loss: 399.6055 - IoU_2: 0.1331 - val_loss: 412.4891 - val_IoU_2: 0.1263
Epoch 3/30
144/144 [==============================] - 16s 112ms/step - loss: 390.5389 - IoU_2: 0.1243 - val_loss: 405.8780 - val_IoU_2: 0.1212
Epoch 4/30
144/144 [==============================] - 16s 110ms/step - loss: 387.4046 - IoU_2: 0.1203 - val_loss: 402.8160 - val_IoU_2: 0.1211
Epoch 5/30
144/144 [==============================] - 16s 110ms/step - loss: 385.9109 - IoU_2: 0.1181 - val_loss: 401.5852 - val_IoU_2: 0.1194
<keras.callbacks.History at 0x282a356ac70>
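The `/255.0` step above rescales the 8-bit pixel range into [0, 1], a standard normalization before training; the effect on the value range is simple to verify:

```python
import numpy as np

# Dummy 8-bit-style pixel values in [0, 255].
batch = np.array([[0, 64, 128, 255]], dtype=np.float32)

# Dividing by 255 maps the range onto [0, 1].
scaled = batch / 255.0
print(scaled.min(), scaled.max())  # 0.0 1.0
```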
train_4500_rows.sample(10).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | patientId | x | y | width | height | Target | Full_filename | Filename |
|---|---|---|---|---|---|---|---|---|
| 2785 | 56b5c0cf-8a3c-42f2-b3e2-adb2f550c451 | 81.000000 | 587.000000 | 210.000000 | 222.000000 | 1 | stage_2_train_images/56b5c0cf-8a3c-42f2-b3e2-adb2f550c451.dcm | 56b5c0cf-8a3c-42f2-b3e2-adb2f550c451.dcm |
| 2546 | 4b85f617-4e30-404c-b9c1-5e9b8776dfba | 678.000000 | 290.000000 | 249.000000 | 506.000000 | 1 | stage_2_train_images/4b85f617-4e30-404c-b9c1-5e9b8776dfba.dcm | 4b85f617-4e30-404c-b9c1-5e9b8776dfba.dcm |
| 3112 | 64917123-823f-4c01-a0d6-06bdfd899a6c | 242.000000 | 260.000000 | 240.000000 | 507.000000 | 1 | stage_2_train_images/64917123-823f-4c01-a0d6-06bdfd899a6c.dcm | 64917123-823f-4c01-a0d6-06bdfd899a6c.dcm |
| 908 | 2635357b-8865-4551-9ed8-3f90b996c982 | 225.000000 | 489.000000 | 164.000000 | 115.000000 | 1 | stage_2_train_images/2635357b-8865-4551-9ed8-3f90b996c982.dcm | 2635357b-8865-4551-9ed8-3f90b996c982.dcm |
| 2412 | 463c04a5-94b8-4969-a725-a58c2b6211c1 | 599.000000 | 309.000000 | 282.000000 | 546.000000 | 1 | stage_2_train_images/463c04a5-94b8-4969-a725-a58c2b6211c1.dcm | 463c04a5-94b8-4969-a725-a58c2b6211c1.dcm |
| 4466 | 8d48694a-f998-4043-bc98-63a3c18527f2 | 481.000000 | 232.000000 | 329.000000 | 642.000000 | 1 | stage_2_train_images/8d48694a-f998-4043-bc98-63a3c18527f2.dcm | 8d48694a-f998-4043-bc98-63a3c18527f2.dcm |
| 1883 | 3b94276b-4b65-4339-a629-0500f0171c45 | 621.000000 | 572.000000 | 242.000000 | 194.000000 | 1 | stage_2_train_images/3b94276b-4b65-4339-a629-0500f0171c45.dcm | 3b94276b-4b65-4339-a629-0500f0171c45.dcm |
| 4114 | 84d73e68-ceb9-4bff-94d9-9b97523109ee | 115.000000 | 509.000000 | 370.000000 | 286.000000 | 1 | stage_2_train_images/84d73e68-ceb9-4bff-94d9-9b97523109ee.dcm | 84d73e68-ceb9-4bff-94d9-9b97523109ee.dcm |
| 4378 | 8b252ccb-1a24-47cf-bd2d-b42c6c9a2557 | 235.000000 | 236.000000 | 227.000000 | 627.000000 | 1 | stage_2_train_images/8b252ccb-1a24-47cf-bd2d-b42c6c9a2557.dcm | 8b252ccb-1a24-47cf-bd2d-b42c6c9a2557.dcm |
| 2934 | 5c0bb75f-a4d1-4070-b40d-6fff71138147 | 225.000000 | 634.000000 | 128.000000 | 96.000000 | 1 | stage_2_train_images/5c0bb75f-a4d1-4070-b40d-6fff71138147.dcm | 5c0bb75f-a4d1-4070-b40d-6fff71138147.dcm |
samp_data = train_4500_rows["patientId"].sample(10)
sample_data2 = ['00436515-870c-4b36-a041-de91049b9ab4',
'00704310-78a8-4b38-8475-49f4573b2dbb',
'00aecb01-a116-45a2-956c-08d2fa55433f',
'00c0b293-48e7-4e16-ac76-9269ba535a62',
'00f08de1-517e-4652-a04f-d1dc9ee48593']
print(sample_data2)
['00436515-870c-4b36-a041-de91049b9ab4', '00704310-78a8-4b38-8475-49f4573b2dbb', '00aecb01-a116-45a2-956c-08d2fa55433f', '00c0b293-48e7-4e16-ac76-9269ba535a62', '00f08de1-517e-4652-a04f-d1dc9ee48593']
train_coords = train_4500_rows[["patientId", "x", "y", "width", "height"]]
train_coords.sample(10).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | patientId | x | y | width | height |
|---|---|---|---|---|---|
| 1904 | 3bd6f6e0-c07a-4c22-87e7-53d60de9ad0e | 142.000000 | 265.000000 | 228.000000 | 503.000000 |
| 3244 | 6a82ef54-9e27-4c55-afba-47298ad088c9 | 236.000000 | 512.000000 | 113.000000 | 90.000000 |
| 2912 | 5b72c6d2-e7eb-44a6-b625-78d2ec0008f2 | 170.000000 | 414.000000 | 177.000000 | 455.000000 |
| 932 | 29b9a5ae-56e4-4e15-a78d-cbaeda0c276a | 722.000000 | 567.000000 | 139.000000 | 238.000000 |
| 1351 | 34e283d8-3c14-4d47-8882-fb53ec242f87 | 227.000000 | 207.000000 | 250.000000 | 570.000000 |
| 3022 | 5fe137e2-d416-4f49-b685-6a3fe5367910 | 160.000000 | 408.000000 | 184.000000 | 412.000000 |
| 2645 | 4fa69712-d57b-4ea9-bdf3-40cdc03f02ff | 309.000000 | 230.000000 | 233.000000 | 441.000000 |
| 4038 | 826a4142-b079-4825-85f7-5fd69292e4cf | 453.000000 | 231.000000 | 284.000000 | 590.000000 |
| 2420 | 4668a90c-d352-47d0-a3ac-a54aa8ce053b | 162.000000 | 611.000000 | 230.000000 | 235.000000 |
| 522 | 0f831256-0a69-4fe5-9719-1563d2b6b7b9 | 166.000000 | 535.000000 | 134.000000 | 279.000000 |
train_coords[train_coords["patientId"].isin(sample_data2)].style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | patientId | x | y | width | height |
|---|---|---|---|---|---|
| 0 | 00436515-870c-4b36-a041-de91049b9ab4 | 264.000000 | 152.000000 | 213.000000 | 379.000000 |
| 1 | 00436515-870c-4b36-a041-de91049b9ab4 | 562.000000 | 152.000000 | 256.000000 | 453.000000 |
| 2 | 00704310-78a8-4b38-8475-49f4573b2dbb | 323.000000 | 577.000000 | 160.000000 | 104.000000 |
| 3 | 00704310-78a8-4b38-8475-49f4573b2dbb | 695.000000 | 575.000000 | 162.000000 | 137.000000 |
| 4 | 00aecb01-a116-45a2-956c-08d2fa55433f | 288.000000 | 322.000000 | 94.000000 | 135.000000 |
| 5 | 00aecb01-a116-45a2-956c-08d2fa55433f | 547.000000 | 299.000000 | 119.000000 | 165.000000 |
| 6 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | 306.000000 | 544.000000 | 168.000000 | 244.000000 |
| 7 | 00c0b293-48e7-4e16-ac76-9269ba535a62 | 650.000000 | 511.000000 | 206.000000 | 284.000000 |
| 8 | 00f08de1-517e-4652-a04f-d1dc9ee48593 | 181.000000 | 184.000000 | 206.000000 | 506.000000 |
| 9 | 00f08de1-517e-4652-a04f-d1dc9ee48593 | 571.000000 | 275.000000 | 230.000000 | 476.000000 |
train_coords.dtypes
patientId     object
x            float64
y            float64
width        float64
height       float64
dtype: object
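The lookup a few cells above relies on `Series.isin` to pull every box row whose `patientId` appears in the sampled list (patients with two boxes contribute two rows). A self-contained illustration:

```python
import pandas as pd

# Toy frame: patient "a" has two boxes, like the multi-box patients above.
df = pd.DataFrame({"patientId": ["a", "a", "b", "c"], "x": [1, 2, 3, 4]})

# Keep every row whose patientId is in the sample.
subset = df[df["patientId"].isin(["a", "c"])]
print(subset["x"].tolist())  # [1, 2, 4]
```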
def gt_pb(pass_model, data):
    """Plot each patient's ground-truth boxes (red) and the model's predicted boxes (yellow)."""
    for pat_id_1 in data:
        # Index of every ground-truth box row for this patient (one row per box)
        no_ind = train_coords[train_coords['patientId'] == pat_id_1].index.values
        l_ind = len(no_ind)
        fname = pat_id_1 + ".dcm"
        image_path = "stage_2_train_images/" + fname
        ds = dicom.dcmread(image_path)
        img = ds.pixel_array
        print("patientid", pat_id_1)
        fig, a = plt.subplots(1, 1)
        fig.set_size_inches(5, 5)
        rect = []
        rect1 = []
        # Draw the ground-truth boxes in red
        for j in range(l_ind):
            X = train_coords.iloc[no_ind[j], 1]
            Y = train_coords.iloc[no_ind[j], 2]
            width = train_coords.iloc[no_ind[j], 3]
            height = train_coords.iloc[no_ind[j], 4]
            rect.append(patches.Rectangle((X, Y), width, height, linewidth=3, edgecolor='r', facecolor='none'))
            a.imshow(img, cmap=plt.cm.bone)
            a.add_patch(rect[j])
        # Predict one box per ground-truth box and draw it in yellow; predictions
        # are in 128x128 space, so scale back to the 1024x1024 image (factor 8)
        for k in range(l_ind):
            img_for_prediction = img[tf.newaxis, ...]
            img_pred_128 = np.resize(img, (img_for_prediction.shape[0], 128, 128, 3))
            img_pred_region = pass_model.predict(img_pred_128)  # Predict the BBox
            x0 = img_pred_region[0, 0]
            y0 = img_pred_region[0, 1]
            x1 = img_pred_region[0, 2]
            y1 = img_pred_region[0, 3]
            rect1.append(patches.Rectangle((x0 * 8, y0 * 8), x1 * 8, y1 * 8, linewidth=3, edgecolor='y', facecolor='none'))
            a.add_patch(rect1[k])
        plt.show()
        plt.figure().clear()
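In `gt_pb`, the predicted coordinates are multiplied by 8 before plotting because the model's outputs live in 128x128 space while the figures show the original 1024x1024 DICOM image (1024/128 = 8):

```python
# Model output in 128x128 space (x, y, width, height).
pred_128 = [38.75, 75.875, 29.875, 32.375]

# Scale back up to the 1024x1024 image for plotting.
pred_1024 = [v * 8 for v in pred_128]
print(pred_1024)  # [310.0, 607.0, 239.0, 259.0]
```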
gt_pb(base_model1, sample_data2)
patientid 00436515-870c-4b36-a041-de91049b9ab4 1/1 [==============================] - 0s 431ms/step 1/1 [==============================] - 0s 64ms/step
patientid 00704310-78a8-4b38-8475-49f4573b2dbb 1/1 [==============================] - 0s 75ms/step 1/1 [==============================] - 0s 65ms/step
patientid 00aecb01-a116-45a2-956c-08d2fa55433f 1/1 [==============================] - 0s 72ms/step 1/1 [==============================] - 0s 64ms/step
patientid 00c0b293-48e7-4e16-ac76-9269ba535a62 1/1 [==============================] - 0s 68ms/step 1/1 [==============================] - 0s 63ms/step
patientid 00f08de1-517e-4652-a04f-d1dc9ee48593 1/1 [==============================] - 0s 63ms/step 1/1 [==============================] - 0s 62ms/step
samp_data_arr = np.array(samp_data)
samp_data_arr
array(['47cb6d42-b61b-4cdc-b865-b26f2196f979',
'729f2aa0-9564-4228-b516-1d8d4be8bb55',
'78c9b88e-a134-4470-b161-22e4a698206c',
'2b70a757-4dff-4d97-baf5-f41b8d3ab4da',
'88b5a0c7-4f3b-4a3a-9882-617c1b87c187',
'05fe7d73-5c8c-4349-88c4-2071a00d6d81',
'15cfe136-58ea-4e38-b3a8-d372ecad4ac7',
'3162d754-7972-47b0-ae60-1ae6e2bb4150',
'7bb1c647-d8d1-4640-b495-f41c4d2a34ca',
'81d9d146-e9bb-478b-aeed-c2431780030e'], dtype=object)
gt_pb(base_model1, samp_data_arr)
patientid 47cb6d42-b61b-4cdc-b865-b26f2196f979 1/1 [==============================] - 0s 68ms/step 1/1 [==============================] - 0s 67ms/step
patientid 729f2aa0-9564-4228-b516-1d8d4be8bb55 1/1 [==============================] - 0s 67ms/step
patientid 78c9b88e-a134-4470-b161-22e4a698206c 1/1 [==============================] - 0s 71ms/step 1/1 [==============================] - 0s 65ms/step 1/1 [==============================] - 0s 61ms/step
patientid 2b70a757-4dff-4d97-baf5-f41b8d3ab4da 1/1 [==============================] - 0s 84ms/step 1/1 [==============================] - 0s 65ms/step
patientid 88b5a0c7-4f3b-4a3a-9882-617c1b87c187 1/1 [==============================] - 0s 71ms/step 1/1 [==============================] - 0s 68ms/step
patientid 05fe7d73-5c8c-4349-88c4-2071a00d6d81 1/1 [==============================] - 0s 76ms/step 1/1 [==============================] - 0s 76ms/step
patientid 15cfe136-58ea-4e38-b3a8-d372ecad4ac7 1/1 [==============================] - 0s 71ms/step 1/1 [==============================] - 0s 73ms/step
patientid 3162d754-7972-47b0-ae60-1ae6e2bb4150 1/1 [==============================] - 0s 69ms/step
patientid 7bb1c647-d8d1-4640-b495-f41c4d2a34ca 1/1 [==============================] - 0s 87ms/step
patientid 81d9d146-e9bb-478b-aeed-c2431780030e 1/1 [==============================] - 0s 74ms/step 1/1 [==============================] - 0s 81ms/step
# path of pretrained models weights
weights_path = "brucechou1983_CheXNet_Keras_0.3.0_weights.h5"
# from tensorflow.keras.applications.densenet.DenseNet import Densenet121
from tensorflow.keras.models import Model
from tensorflow.keras.layers import Conv2D, Reshape
def create_model_new(trainable=True):
    model = tf.keras.applications.DenseNet121(weights=None, include_top=False, input_shape=(128, 128, 3))
    model.load_weights(weights_path, by_name=True)
    # Freeze (or unfreeze) the backbone layers; the new top layer added below is always trainable
    for layer in model.layers:
        layer.trainable = trainable
    # Add a new top layer: a conv layer the same size as the final feature map, so only the 4 BBox coords are output
    x0 = model.layers[-1].output
    x1 = Conv2D(4, kernel_size=4, kernel_initializer='normal', kernel_regularizer=tf.keras.regularizers.l2(0.0015),
                activity_regularizer=tf.keras.regularizers.l1(0.0025), activation="relu", name="coords")(x0)
    x2 = Reshape((4,))(x1)  # The 4 predicted coordinates of one BBox
    return Model(inputs=model.input, outputs=x2)
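A quick sanity check on the freezing logic in `create_model_new`: with the backbone's layers set non-trainable, only the new `coords` head should contribute trainable weights. A tiny stand-in sketch (no DenseNet weights download needed):

```python
import tensorflow as tf

# Stand-in backbone + head, mirroring the freeze-then-add-head pattern above.
inp = tf.keras.Input(shape=(4,))
backbone = tf.keras.layers.Dense(8, name="backbone")
h = backbone(inp)
backbone.trainable = False          # freeze the backbone
out = tf.keras.layers.Dense(4, name="coords_head")(h)
model = tf.keras.Model(inp, out)

# Only the head's kernel and bias remain trainable.
print(len(model.trainable_weights))  # 2
```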
base_model2 = create_model_new(False)
# Compile the model
opt = Adam(learning_rate=0.00001, beta_1=0.9, beta_2=0.999, epsilon=1e-07)
base_model2.compile(loss="mean_squared_error", optimizer=opt, metrics=[IoU_2]) # Regression loss is MSE
# Use earlystopping
callback2 = tf.keras.callbacks.EarlyStopping(monitor='val_IoU_2', patience=5, min_delta=0.01)
# Fit the model
base_model2.fit(X_train2, y_train2, validation_data=(X_val2, y_val2), epochs=40, batch_size=20, callbacks=[callback2])
Epoch 1/40
144/144 [==============================] - 500s 3s/step - loss: 2006.0892 - IoU_2: 0.1283 - val_loss: 845.5762 - val_IoU_2: 0.1521
Epoch 2/40
144/144 [==============================] - 476s 3s/step - loss: 757.3915 - IoU_2: 0.1580 - val_loss: 740.6445 - val_IoU_2: 0.1551
Epoch 3/40
144/144 [==============================] - 467s 3s/step - loss: 699.4479 - IoU_2: 0.1572 - val_loss: 704.5209 - val_IoU_2: 0.1522
Epoch 4/40
144/144 [==============================] - 462s 3s/step - loss: 667.6571 - IoU_2: 0.1515 - val_loss: 682.9375 - val_IoU_2: 0.1499
Epoch 5/40
144/144 [==============================] - 466s 3s/step - loss: 645.8419 - IoU_2: 0.1494 - val_loss: 666.6833 - val_IoU_2: 0.1383
Epoch 6/40
144/144 [==============================] - 392s 3s/step - loss: 630.7343 - IoU_2: 0.1424 - val_loss: 647.8849 - val_IoU_2: 0.1496
Epoch 7/40
144/144 [==============================] - 284s 2s/step - loss: 618.7048 - IoU_2: 0.1416 - val_loss: 648.3870 - val_IoU_2: 0.1419
Epoch 8/40
144/144 [==============================] - 281s 2s/step - loss: 608.3607 - IoU_2: 0.1426 - val_loss: 655.6977 - val_IoU_2: 0.1450
Epoch 9/40
144/144 [==============================] - 282s 2s/step - loss: 601.9389 - IoU_2: 0.1427 - val_loss: 637.2079 - val_IoU_2: 0.1454
Epoch 10/40
144/144 [==============================] - 281s 2s/step - loss: 594.9384 - IoU_2: 0.1425 - val_loss: 622.7687 - val_IoU_2: 0.1504
<keras.callbacks.History at 0x1d834336b50>
base_model2.evaluate(X_test2, y_test2)
29/29 [==============================] - 58s 2s/step - loss: 598.5198 - IoU_2: 0.1362
[598.519775390625, 0.1362275779247284]
y_pred = base_model2.predict(X_test2)
29/29 [==============================] - 57s 2s/step
pkl.dump(base_model2, open('densenet_objdet', 'wb'))
WARNING:absl:Found untraced functions such as _jit_compiled_convolution_op, _jit_compiled_convolution_op, _jit_compiled_convolution_op, _jit_compiled_convolution_op, _jit_compiled_convolution_op while saving (showing 5 of 121). These functions will not be directly callable after loading.
INFO:tensorflow:Assets written to: ram://2703cdd6-e8d6-4f40-b48b-6cfbe5a12d57/assets
INFO:tensorflow:Assets written to: ram://2703cdd6-e8d6-4f40-b48b-6cfbe5a12d57/assets
print(y_test2[1], y_pred[1],'\n')
print(y_test2[11], y_pred[11],'\n')
print(y_test2[111], y_pred[111],'\n')
print(y_test2[21], y_pred[21],'\n')
[75.75  39.    25.125 60.   ] [45.190815 48.243572  7.9350333 55.637497 ]

[69.625 27.125 12.375 18.75 ] [54.278156 52.235233  0.        41.434162]

[34.5   55.    33.875 60.125] [46.794376 39.954098  0.        50.356937]

[68.25  24.625 32.25  54.875] [16.15029  52.620552 32.511173 29.475729]
gt_pb(base_model2, sample_data2)
patientid 00436515-870c-4b36-a041-de91049b9ab4 1/1 [==============================] - 1s 942ms/step 1/1 [==============================] - 1s 843ms/step
patientid 00704310-78a8-4b38-8475-49f4573b2dbb 1/1 [==============================] - 1s 869ms/step 1/1 [==============================] - 1s 764ms/step
patientid 00aecb01-a116-45a2-956c-08d2fa55433f 1/1 [==============================] - 1s 772ms/step 1/1 [==============================] - 1s 857ms/step
patientid 00c0b293-48e7-4e16-ac76-9269ba535a62 1/1 [==============================] - 1s 833ms/step 1/1 [==============================] - 1s 953ms/step
patientid 00f08de1-517e-4652-a04f-d1dc9ee48593 1/1 [==============================] - 1s 892ms/step 1/1 [==============================] - 1s 766ms/step
gt_pb(base_model2, samp_data_arr)
patientid 47cb6d42-b61b-4cdc-b865-b26f2196f979 1/1 [==============================] - 1s 823ms/step 1/1 [==============================] - 1s 740ms/step
patientid 729f2aa0-9564-4228-b516-1d8d4be8bb55 1/1 [==============================] - 1s 963ms/step
patientid 78c9b88e-a134-4470-b161-22e4a698206c 1/1 [==============================] - 1s 807ms/step 1/1 [==============================] - 1s 892ms/step 1/1 [==============================] - 1s 788ms/step
patientid 2b70a757-4dff-4d97-baf5-f41b8d3ab4da 1/1 [==============================] - 1s 819ms/step 1/1 [==============================] - 1s 824ms/step
patientid 88b5a0c7-4f3b-4a3a-9882-617c1b87c187 1/1 [==============================] - 1s 858ms/step 1/1 [==============================] - 1s 878ms/step
patientid 05fe7d73-5c8c-4349-88c4-2071a00d6d81 1/1 [==============================] - 1s 778ms/step 1/1 [==============================] - 1s 809ms/step
patientid 15cfe136-58ea-4e38-b3a8-d372ecad4ac7 1/1 [==============================] - 1s 877ms/step 1/1 [==============================] - 1s 816ms/step
patientid 3162d754-7972-47b0-ae60-1ae6e2bb4150 1/1 [==============================] - 1s 770ms/step
patientid 7bb1c647-d8d1-4640-b495-f41c4d2a34ca 1/1 [==============================] - 1s 1s/step
patientid 81d9d146-e9bb-478b-aeed-c2431780030e 1/1 [==============================] - 1s 937ms/step 1/1 [==============================] - 1s 835ms/step
!pip install pycocotools
Requirement already satisfied: pycocotools in c:\users\thril\anaconda3\lib\site-packages (2.0.6)
Requirement already satisfied: numpy in c:\users\thril\anaconda3\lib\site-packages (from pycocotools) (1.21.5)
Requirement already satisfied: matplotlib>=2.1.0 in c:\users\thril\anaconda3\lib\site-packages (from pycocotools) (3.5.2)
Requirement already satisfied: packaging>=20.0 in c:\users\thril\anaconda3\lib\site-packages (from matplotlib>=2.1.0->pycocotools) (21.3)
Requirement already satisfied: pyparsing>=2.2.1 in c:\users\thril\anaconda3\lib\site-packages (from matplotlib>=2.1.0->pycocotools) (2.4.7)
Requirement already satisfied: cycler>=0.10 in c:\users\thril\anaconda3\lib\site-packages (from matplotlib>=2.1.0->pycocotools) (0.11.0)
Requirement already satisfied: fonttools>=4.22.0 in c:\users\thril\anaconda3\lib\site-packages (from matplotlib>=2.1.0->pycocotools) (4.25.0)
Requirement already satisfied: pillow>=6.2.0 in c:\users\thril\anaconda3\lib\site-packages (from matplotlib>=2.1.0->pycocotools) (9.2.0)
Requirement already satisfied: kiwisolver>=1.0.1 in c:\users\thril\anaconda3\lib\site-packages (from matplotlib>=2.1.0->pycocotools) (1.4.2)
Requirement already satisfied: python-dateutil>=2.7 in c:\users\thril\anaconda3\lib\site-packages (from matplotlib>=2.1.0->pycocotools) (2.8.2)
Requirement already satisfied: six>=1.5 in c:\users\thril\anaconda3\lib\site-packages (from python-dateutil>=2.7->matplotlib>=2.1.0->pycocotools) (1.16.0)
!git clone --depth 1 https://github.com/tensorflow/models
fatal: destination path 'models' already exists and is not an empty directory.
%%cmd
cd models/research/
protoc object_detection/protos/*.proto --python_out=.
Microsoft Windows [Version 10.0.22621.819]
(c) Microsoft Corporation. All rights reserved.
C:\Users\thril>cd models/research/
C:\Users\thril\models\research>protoc object_detection/protos/*.proto --python_out=.
C:\Users\thril\models\research>
!cd models  # note: !cd runs in a subshell and does not change the notebook's working directory
os.chdir("models/research")
%%cmd
cd models/research/
protoc object_detection/protos/*.proto --python_out=.
cp object_detection/packages/tf2/setup.py .
python -m pip install .
Microsoft Windows [Version 10.0.22621.819]
(c) Microsoft Corporation. All rights reserved.
C:\Users\thril\models\research>cd models/research/
C:\Users\thril\models\research>protoc object_detection/protos/*.proto --python_out=.
C:\Users\thril\models\research>cp object_detection/packages/tf2/setup.py .
C:\Users\thril\models\research>python -m pip install .
Processing c:\users\thril\models\research
Preparing metadata (setup.py): started
Preparing metadata (setup.py): finished with status 'done'
Collecting avro-python3
Using cached avro_python3-1.10.2-py3-none-any.whl
Collecting apache-beam
Using cached apache_beam-2.43.0-cp39-cp39-win_amd64.whl (4.5 MB)
Requirement already satisfied: pillow in c:\users\thril\anaconda3\lib\site-packages (from object-detection==0.1) (9.2.0)
Requirement already satisfied: lxml in c:\users\thril\anaconda3\lib\site-packages (from object-detection==0.1) (4.9.1)
Requirement already satisfied: matplotlib in c:\users\thril\anaconda3\lib\site-packages (from object-detection==0.1) (3.5.2)
Requirement already satisfied: Cython in c:\users\thril\anaconda3\lib\site-packages (from object-detection==0.1) (0.29.32)
Requirement already satisfied: contextlib2 in c:\users\thril\anaconda3\lib\site-packages (from object-detection==0.1) (21.6.0)
Requirement already satisfied: tf-slim in c:\users\thril\anaconda3\lib\site-packages (from object-detection==0.1) (1.1.0)
Requirement already satisfied: six in c:\users\thril\anaconda3\lib\site-packages (from object-detection==0.1) (1.16.0)
Requirement already satisfied: pycocotools in c:\users\thril\anaconda3\lib\site-packages (from object-detection==0.1) (2.0.6)
Collecting lvis
Using cached lvis-0.5.3-py3-none-any.whl (14 kB)
Requirement already satisfied: scipy in c:\users\thril\anaconda3\lib\site-packages (from object-detection==0.1) (1.9.1)
Requirement already satisfied: pandas in c:\users\thril\anaconda3\lib\site-packages (from object-detection==0.1) (1.4.4)
Collecting tf-models-official>=2.5.1
Using cached tf_models_official-2.11.0-py2.py3-none-any.whl (2.3 MB)
Requirement already satisfied: tensorflow_io in c:\users\thril\anaconda3\lib\site-packages (from object-detection==0.1) (0.28.0)
Requirement already satisfied: keras in c:\users\thril\anaconda3\lib\site-packages (from object-detection==0.1) (2.10.0)
Requirement already satisfied: pyparsing==2.4.7 in c:\users\thril\anaconda3\lib\site-packages (from object-detection==0.1) (2.4.7)
Collecting sacrebleu<=2.2.0
Using cached sacrebleu-2.2.0-py3-none-any.whl (116 kB)
Requirement already satisfied: colorama in c:\users\thril\anaconda3\lib\site-packages (from sacrebleu<=2.2.0->object-detection==0.1) (0.4.5)
Requirement already satisfied: numpy>=1.17 in c:\users\thril\anaconda3\lib\site-packages (from sacrebleu<=2.2.0->object-detection==0.1) (1.21.5)
Requirement already satisfied: regex in c:\users\thril\anaconda3\lib\site-packages (from sacrebleu<=2.2.0->object-detection==0.1) (2022.7.9)
Requirement already satisfied: tabulate>=0.8.9 in c:\users\thril\anaconda3\lib\site-packages (from sacrebleu<=2.2.0->object-detection==0.1) (0.8.10)
Requirement already satisfied: portalocker in c:\users\thril\anaconda3\lib\site-packages (from sacrebleu<=2.2.0->object-detection==0.1) (2.6.0)
Requirement already satisfied: gin-config in c:\users\thril\anaconda3\lib\site-packages (from tf-models-official>=2.5.1->object-detection==0.1) (0.5.0)
Collecting immutabledict
Using cached immutabledict-2.2.3-py3-none-any.whl (4.0 kB)
Collecting seqeval
Using cached seqeval-1.2.2-py3-none-any.whl
Collecting tensorflow~=2.11.0
Using cached tensorflow-2.11.0-cp39-cp39-win_amd64.whl (1.9 kB)
Collecting tf-models-official>=2.5.1
Using cached tf_models_official-2.10.1-py2.py3-none-any.whl (2.2 MB)
Collecting opencv-python-headless
Using cached opencv_python_headless-4.6.0.66-cp36-abi3-win_amd64.whl (35.5 MB)
Collecting tensorflow-text~=2.10.0
Using cached tensorflow_text-2.10.0-cp39-cp39-win_amd64.whl (5.0 MB)
Collecting oauth2client
Using cached oauth2client-4.1.3-py2.py3-none-any.whl (98 kB)
Requirement already satisfied: sentencepiece in c:\users\thril\anaconda3\lib\site-packages (from tf-models-official>=2.5.1->object-detection==0.1) (0.1.97)
Collecting kaggle>=1.3.9
Using cached kaggle-1.5.12-py3-none-any.whl
Collecting google-api-python-client>=1.6.7
Using cached google_api_python_client-2.66.0-py2.py3-none-any.whl (10.5 MB)
Requirement already satisfied: tensorflow-model-optimization>=0.4.1 in c:\users\thril\anaconda3\lib\site-packages (from tf-models-official>=2.5.1->object-detection==0.1) (0.7.3)
Requirement already satisfied: tensorflow-hub>=0.6.0 in c:\users\thril\anaconda3\lib\site-packages (from tf-models-official>=2.5.1->object-detection==0.1) (0.12.0)
Requirement already satisfied: py-cpuinfo>=3.3.0 in c:\users\thril\anaconda3\lib\site-packages (from tf-models-official>=2.5.1->object-detection==0.1) (9.0.0)
[... pip dependency-resolution log trimmed: remaining requirements were already satisfied or restored from the local wheel cache ...]
Building wheels for collected packages: object-detection
Building wheel for object-detection (setup.py): started
Building wheel for object-detection (setup.py): finished with status 'done'
Created wheel for object-detection: filename=object_detection-0.1-py3-none-any.whl size=1666274 sha256=dd41be69fbe91af582f21371b566cc0a8ca38fda7d58dba061029b5fcb433f9a
Stored in directory: C:\Users\thril\AppData\Local\Temp\pip-ephem-wheel-cache-9f7k37mw\wheels\49\99\11\aec3644786db93291ddd1f3d541713ed65a513a003eb0a3598
Successfully built object-detection
Installing collected packages: protobuf, orjson, opencv-python-headless, opencv-python, objsize, importlib_resources, immutabledict, httplib2, fasteners, fastavro, etils, dill, cloudpickle, avro-python3, tensorflow-addons, sacrebleu, proto-plus, oauth2client, kaggle, hdfs, googleapis-common-protos, tensorflow-metadata, seqeval, lvis, google-auth-httplib2, google-api-core, apache-beam, tensorflow-datasets, google-api-python-client, tensorflow-text, tf-models-official, object-detection
Attempting uninstall: protobuf
Found existing installation: protobuf 3.20.1
Uninstalling protobuf-3.20.1:
Successfully uninstalled protobuf-3.20.1
Rolling back uninstall of protobuf
Moving to c:\users\thril\anaconda3\lib\site-packages\protobuf-3.20.1-py3.9.egg-info
from C:\Users\thril\anaconda3\Lib\site-packages\~rotobuf-3.20.1-py3.9.egg-info
C:\Users\thril\models\research>
The system cannot find the path specified.
ERROR: Could not install packages due to an OSError: [WinError 5] Access is denied: 'C:\\Users\\thril\\anaconda3\\Lib\\site-packages\\google\\protobuf\\internal\\_api_implementation.cp39-win_amd64.pyd'
Consider using the `--user` option or check the permissions.
# pwd
# Run `!cd ..` twice after importing the models above.
# This points the notebook back to the earlier project execution directory;
# otherwise the later commands fail because the working directory was changed.
!cd ..
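Worth noting why the directory change is fiddly: a `!` command runs in a throwaway subshell, so its `cd` does not survive the cell, whereas changing the Python process's own working directory (what the `%cd` magic does) persists. A minimal sketch of the distinction, using `os.chdir` in place of the magic:

```python
import os
import subprocess

# A shell spawned with "!" (or subprocess) gets its own working directory;
# changing it there does not affect the notebook's Python process.
before = os.getcwd()
subprocess.run("cd ..", shell=True)  # cwd changes only inside the child shell
assert os.getcwd() == before

# To move the notebook itself up one level, change the Python process's cwd
# (the %cd magic does the same thing under the hood):
os.chdir("..")
print(os.getcwd())
```

This is why `%cd ..` is usually preferred over `!cd ..` when the change must stick across cells.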
# patch tf1 into `utils.ops`
utils_ops.tensorflow = tf.compat.v1
# Patch the location of gfile
tf.gfile = tf.io.gfile
import tarfile
# Unpack the pretrained Faster R-CNN archive; the `with` block closes the
# file automatically, so no explicit close() is needed.
with tarfile.open('faster_rcnn_nas_coco_2018_01_28.tar.gz') as tgf:
    tgf.extractall()
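The extraction pattern can be exercised end to end without the real model archive. A self-contained sketch that builds a tiny `.tar.gz` in memory (the file names inside it are made up for illustration) and unpacks it the same way:

```python
import io
import pathlib
import tarfile
import tempfile

# Build a tiny .tar.gz in memory to stand in for the model archive
# (member names here are illustrative, not the real checkpoint files).
buf = io.BytesIO()
with tarfile.open(fileobj=buf, mode="w:gz") as tgf:
    data = b"dummy saved_model bytes"
    info = tarfile.TarInfo(name="demo_model/saved_model/saved_model.pb")
    info.size = len(data)
    tgf.addfile(info, io.BytesIO(data))

# Extract it the same way the notebook unpacks the Faster R-CNN archive.
buf.seek(0)
dest = tempfile.mkdtemp()
with tarfile.open(fileobj=buf, mode="r:gz") as tgf:
    tgf.extractall(path=dest)

print(sorted(p.name for p in pathlib.Path(dest).rglob("*.pb")))
```

The extracted directory layout (`<model_name>/saved_model/`) is what `load_model` below expects to find.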
def load_model(model_name):
    model_dir = pathlib.Path(model_name) / "saved_model"
    model = tf.saved_model.load(str(model_dir))
    model = model.signatures['serving_default']
    return model
model_name = 'faster_rcnn_nas_coco_2018_01_28'
detection_model = load_model(model_name)
INFO:tensorflow:Saver not created because there are no variables in the graph to restore
# Label map used to attach the correct class name to each detected box.
PATH_TO_LABELS = "labels.pbtxt"
print(PATH_TO_LABELS)
category_index = label_map_util.create_category_index_from_labelmap(PATH_TO_LABELS, use_display_name=True)
labels.pbtxt
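The category index built from the label map is, in essence, a dictionary keyed by class id, with each value holding the id and a display name. A minimal pure-Python stand-in, assuming a two-class label map like the one used for lung opacities (the class names here are illustrative):

```python
# Entries as they would appear after parsing a small labels.pbtxt
# (ids and names below are illustrative assumptions).
items = [
    {"id": 1, "name": "opacity"},
    {"id": 2, "name": "negative"},
]

# The category index maps class id -> record, enabling O(1) lookup.
category_index = {item["id"]: item for item in items}

# Lookup used when drawing the label for a detected class id:
print(category_index[1]["name"])  # opacity
```

This id-to-name lookup is what the visualization utility consults when it annotates each bounding box.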
detection_model.inputs
[<tf.Tensor 'image_tensor:0' shape=(None, None, None, 3) dtype=uint8>]
detection_model.output_shapes
{'detection_scores': TensorShape([None, 100]),
'num_detections': TensorShape([None]),
'detection_classes': TensorShape([None, 100]),
'detection_boxes': TensorShape([None, 100, 4])}
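The `[None, 100, 4]` shape reflects that each of up to 100 detections carries a box as `[ymin, xmin, ymax, xmax]` in normalized `[0, 1]` coordinates; converting to pixels is just a scale by the image size. A small sketch (the helper name and sample values are made up for illustration):

```python
# detection_boxes are [ymin, xmin, ymax, xmax] normalized to [0, 1];
# scaling by image height/width recovers pixel coordinates.
def to_pixel_box(box, height, width):
    ymin, xmin, ymax, xmax = box
    return (int(ymin * height), int(xmin * width),
            int(ymax * height), int(xmax * width))

print(to_pixel_box([0.25, 0.10, 0.75, 0.60], height=224, width=224))
```

This is the same conversion `visualize_boxes_and_labels_on_image_array` performs internally when `use_normalized_coordinates=True`.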
def run_inference_for_single_image(model, image):
    image = np.asarray(image)
    rgb = np.stack([image] * 3, axis=2)  # replicate the grayscale channel to RGB
    input_tensor = tf.convert_to_tensor(rgb)
    # The model expects a batch of images, so add an axis with tf.newaxis
    input_tensor_new = input_tensor[tf.newaxis, ...]
    # Run inference
    output_dict = model(input_tensor_new)
    num_detections = int(output_dict.pop('num_detections'))
    output_dict = {key: value[0, :num_detections].numpy()
                   for key, value in output_dict.items()}
    output_dict['num_detections'] = num_detections
    # detection_classes should be ints
    output_dict['detection_classes'] = output_dict['detection_classes'].astype(np.int64)
    # Handle models with masks
    if 'detection_masks' in output_dict:
        # Reframe the bbox masks to the image size
        detection_masks_reframed = utils_ops.reframe_box_masks_to_image_masks(
            output_dict['detection_masks'],
            output_dict['detection_boxes'], image.shape[0], image.shape[1])
        detection_masks_reframed = tf.cast(detection_masks_reframed > 0.5, tf.uint8)
        output_dict['detection_masks_reframed'] = detection_masks_reframed.numpy()
    return output_dict
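Because `run_inference_for_single_image` returns parallel arrays (scores, classes, boxes), a common next step is to keep only detections above a confidence cutoff. A minimal sketch with plain lists standing in for the numpy arrays (the helper name, the 0.5 threshold, and the sample values are illustrative assumptions):

```python
# Keep only detections whose score meets the threshold, preserving the
# parallel-array layout of the output dict.
def filter_detections(output_dict, min_score=0.5):
    keep = [i for i, s in enumerate(output_dict["detection_scores"])
            if s >= min_score]
    return {
        "detection_scores": [output_dict["detection_scores"][i] for i in keep],
        "detection_classes": [output_dict["detection_classes"][i] for i in keep],
        "detection_boxes": [output_dict["detection_boxes"][i] for i in keep],
        "num_detections": len(keep),
    }

sample = {
    "detection_scores": [0.91, 0.40, 0.72],
    "detection_classes": [1, 1, 2],
    "detection_boxes": [[0.1, 0.1, 0.5, 0.5],
                        [0.2, 0.2, 0.6, 0.6],
                        [0.3, 0.3, 0.7, 0.7]],
}
print(filter_detections(sample)["num_detections"])  # 2
```

The visualization utility applies a similar score threshold internally (via its `min_score_thresh` default), which is why low-confidence boxes do not appear in the rendered images.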
def show_inference(model, image_path):
    # Array-based representation of the image, used to draw the result
    # boxes and labels onto it.
    image_np = np.array(Image.open(image_path))
    # Actual detection
    output_dict = run_inference_for_single_image(model, image_np)
    image_np1 = np.stack([image_np] * 3, axis=2)  # replicate the grayscale channel to RGB
    # Visualization of the results of a detection
    vis_util.visualize_boxes_and_labels_on_image_array(
        image_np1,
        output_dict['detection_boxes'],
        output_dict['detection_classes'],
        output_dict['detection_scores'],
        category_index,
        # instance_masks=output_dict.get('detection_masks_reframed', None),
        use_normalized_coordinates=True,
        line_thickness=8)
    fig, ax = plt.subplots(1, 1)
    fig.set_size_inches(3, 3)
    imS = cv2.resize(image_np1, (224, 224))  # resize for display
    ax.imshow(imS)
    fig.tight_layout()
    plt.show()
    plt.figure().clear()
    plt.close()
PATH_TO_TEST_IMAGES_DIR = pathlib.Path("Testing_Images")
TEST_IMAGE_PATHS = sorted(list(PATH_TO_TEST_IMAGES_DIR.glob("*.jpg")))
TEST_IMAGE_PATHS
[Path('Testing_Images/00d7c36e-3cdf-4df6-ac03-6c30cdc8e85b.jpg'),
Path('Testing_Images/0100515c-5204-4f31-98e0-f35e4b00004a.jpg'),
Path('Testing_Images/016fd98f-09a8-4b99-a8df-c3c26992f22e.jpg'),
Path('Testing_Images/02743848-e50a-4faf-a3f4-a6215613a23d.jpg'),
Path('Testing_Images/02cfa7d1-61e0-4d9e-8664-9a7469ccab76.jpg'),
Path('Testing_Images/02de2d68-b7f9-428b-ac12-0cb8f56a0145.jpg'),
Path('Testing_Images/02ea6b76-4189-47c5-a8bf-70590f84ebd8.jpg'),
Path('Testing_Images/03d92597-3e33-4fdf-8db5-a27cf5b8d3eb.jpg'),
Path('Testing_Images/03edb5ed-9e76-4abe-bc35-7bc95fea7e6a.jpg'),
Path('Testing_Images/04558d1a-1f4d-47aa-bf07-a6f6595f9cec.jpg'),
Path('Testing_Images/04a6aa6d-bd6d-48b1-822b-21b81c4bdf3a.jpg'),
Path('Testing_Images/04dd9347-6215-4b34-a98f-a73e577aff9d.jpg'),
Path('Testing_Images/04e9f692-f3d6-496b-ae0c-905137cc1f84.jpg'),
Path('Testing_Images/04f9bcc7-8886-4e1a-a938-122d3a065e12.jpg'),
Path('Testing_Images/050068ca-78f7-4394-b8de-6ebe347192e9.jpg'),
Path('Testing_Images/051c2709-fbbf-408f-aa19-c869e16a79e2.jpg'),
Path('Testing_Images/054eb541-ccd9-4e3c-a654-dc9855ac8e47.jpg'),
Path('Testing_Images/0567d70b-5a94-4363-a162-7207ffaff023.jpg'),
Path('Testing_Images/05744f1d-2fa5-41ce-85d3-3df6a7302064.jpg'),
Path('Testing_Images/059f1e59-573c-49d1-ac78-3323a2ff047f.jpg'),
Path('Testing_Images/05bb6f8d-453b-4e56-ae92-2600e058ba65.jpg'),
Path('Testing_Images/05c8ec6f-c41d-4dbe-a6a0-2875334b0b9e.jpg'),
Path('Testing_Images/05dad446-45f7-44df-bd2f-4673d9502348.jpg'),
Path('Testing_Images/06893030-14ab-456d-a22d-05070773e370.jpg'),
Path('Testing_Images/068c876a-d9ee-4a1e-810a-63c8195b27bd.jpg'),
Path('Testing_Images/06d3f696-82ea-4c71-b162-f18b3e0eeeac.jpg'),
Path('Testing_Images/06d5a58d-baf1-4937-bfc3-00db1fb2b1be.jpg'),
Path('Testing_Images/0745b403-9006-4534-b28f-0e19ba3ff8e2.jpg'),
Path('Testing_Images/07bb7f0d-ef4b-4963-b453-7f9d8c5b7c62.jpg'),
Path('Testing_Images/07c6955b-57c1-4ba8-8b07-81b333624ed9.jpg'),
Path('Testing_Images/0812b373-eef7-47a9-a942-0850b004eb1e.jpg'),
Path('Testing_Images/0824dce4-e0fa-4129-adc8-224c6afa6bc2.jpg'),
Path('Testing_Images/082fb468-30f1-43c1-ac04-62d16fdd883b.jpg'),
Path('Testing_Images/084aa98a-91aa-45d8-aa52-4fad8344b0bf.jpg'),
Path('Testing_Images/085a25eb-1ebd-4bf1-b320-7833e7c37553.jpg'),
Path('Testing_Images/08ac2f29-e8b4-484e-9394-806ef7bee8ea.jpg'),
Path('Testing_Images/091cc2b7-8ba6-4fce-8ae2-7547117dbddf.jpg'),
Path('Testing_Images/0927e819-9640-487d-a04e-f5d4732dbe0a.jpg'),
Path('Testing_Images/096a447e-8908-432b-a713-a51d3cd2ae9e.jpg'),
Path('Testing_Images/098e14d4-3205-4c2d-a059-738f830c0aa5.jpg'),
Path('Testing_Images/09b2d54e-3a85-4efd-8204-ea73bc70c405.jpg'),
Path('Testing_Images/09d21970-3207-4817-ab35-b26bbfdead9e.jpg'),
Path('Testing_Images/09e68f8e-f07a-4240-93f0-ce1969b46a80.jpg'),
Path('Testing_Images/09eb4d2d-ebb1-4eb7-bfbb-038ffb7100ca.jpg'),
Path('Testing_Images/0a6a5956-58cf-4f17-9e39-7e0d17310f67.jpg'),
Path('Testing_Images/0a825a59-c034-481c-b596-9c7f0bc42c2c.jpg'),
Path('Testing_Images/0aa43663-a1f3-44cb-8ecc-9b36fbb0d778.jpg'),
Path('Testing_Images/0acc499c-802c-47e8-a297-87ea7d971462.jpg'),
Path('Testing_Images/0af9ad69-4d76-4de9-b93a-294478a97866.jpg'),
Path('Testing_Images/0b510189-2f09-4273-89fe-4ce4581ae69a.jpg'),
Path('Testing_Images/0b569b4c-107f-4bf4-9166-c4f51a9f881a.jpg'),
Path('Testing_Images/0beac0c9-331a-4aad-9cec-099aff98c399.jpg'),
Path('Testing_Images/0c17944f-c5d6-4d88-9b77-a5f86d22447b.jpg'),
Path('Testing_Images/0c9be296-285f-44dc-8599-afde00848e9d.jpg'),
Path('Testing_Images/0cac04c6-c4b6-48e5-9b28-b0a6c44289a5.jpg'),
Path('Testing_Images/0cbc601f-91f0-4f86-b780-ffeac24471c7.jpg'),
Path('Testing_Images/0e2abdaa-d654-4824-abaa-9d6fe5b67a95.jpg'),
Path('Testing_Images/0e7d0fc9-9a11-4325-91b2-bb13e40d1f1a.jpg'),
Path('Testing_Images/0e9f4af5-64ae-41aa-b331-e012e28caac4.jpg'),
Path('Testing_Images/0ebc8268-df3d-45d8-8ee7-b34880c62830.jpg'),
Path('Testing_Images/0ecd1c28-bf25-4d32-979c-448356e0edb4.jpg'),
Path('Testing_Images/0ed69049-8906-4b90-8841-727a7716c42b.jpg'),
Path('Testing_Images/0edcf2dd-71fe-4108-9303-24d51b4faf72.jpg'),
Path('Testing_Images/0f1a4a99-3a8a-4d4b-be2c-8a003274ed53.jpg'),
Path('Testing_Images/0ff01216-ce5b-44bd-a857-bc22dc9f8f3c.jpg'),
Path('Testing_Images/1015e103-bda6-4664-978d-f761cd2982cb.jpg'),
Path('Testing_Images/103451bb-1f8c-4209-abce-0be7e9a6c8fb.jpg'),
Path('Testing_Images/1101cedc-35e2-483c-9326-1f3232b88f06.jpg'),
Path('Testing_Images/122bd0b8-4b45-4587-8cb9-1848e52aa863.jpg'),
Path('Testing_Images/129a1251-95cf-4663-91ad-4e58bc0b2d08.jpg'),
Path('Testing_Images/1321f38d-adb7-491d-97eb-277d57ab6e59.jpg'),
Path('Testing_Images/133a1f56-1ab0-4df7-8e1b-4a6293934f3a.jpg'),
Path('Testing_Images/13ef2d8e-3b2e-4fbb-9f70-69139dd74401.jpg'),
Path('Testing_Images/14302d4e-57f0-481b-a5a4-2305eebb0bb6.jpg'),
Path('Testing_Images/14cc479c-1385-46a1-9559-638c7da6f02d.jpg'),
Path('Testing_Images/1565fd6c-9432-454a-87e4-937fd5ad5b60.jpg'),
Path('Testing_Images/15c262ab-4ec6-4b19-8172-3aaf662e05c3.jpg'),
Path('Testing_Images/15f173c9-43b4-43b7-9dde-ff87543b959f.jpg'),
Path('Testing_Images/161cacb1-dcdd-45f7-bfb6-3281fe82203d.jpg'),
Path('Testing_Images/1643d9ee-7f12-4a05-9328-f7d775f6cd2c.jpg'),
Path('Testing_Images/165e4595-deeb-4910-b06b-89e2fba035f6.jpg'),
Path('Testing_Images/1682a002-e834-4696-8101-70afe4edeebf.jpg'),
Path('Testing_Images/169d225c-4691-4897-a382-8aac28b0c348.jpg'),
Path('Testing_Images/173da42f-917c-4e33-9c95-71132361599c.jpg'),
Path('Testing_Images/177423d8-e811-4fd7-bfba-6b77b479e466.jpg'),
Path('Testing_Images/179c5fc6-32a7-4144-91e9-58c2371616d7.jpg'),
Path('Testing_Images/17a1427c-70b0-4383-b5f3-a2b5f1ec0691.jpg'),
Path('Testing_Images/17ad2599-0077-4b6d-b9dc-e80efcb9c815.jpg'),
Path('Testing_Images/17c749dc-e97a-4c5c-b4f4-c68b135a1a64.jpg'),
Path('Testing_Images/17c89450-d10e-4041-afa0-4c5a39adccdd.jpg'),
Path('Testing_Images/17ec1772-41a6-4bd6-a2a0-b9d45599eb28.jpg'),
Path('Testing_Images/17f68ac3-2385-4b13-b8c0-66ba8b343205.jpg'),
Path('Testing_Images/184f1560-cb67-4361-88e5-17a3d8744b2d.jpg'),
Path('Testing_Images/185d494c-3ec2-4317-bb8a-bcd0f9ce7615.jpg'),
Path('Testing_Images/1871337b-8e0b-4329-93a4-b0d9afc5c814.jpg'),
Path('Testing_Images/191782c0-7007-44a3-a774-31524e20111d.jpg'),
Path('Testing_Images/192748b6-82f2-4f0d-a25d-cea1600c7246.jpg'),
Path('Testing_Images/192969b5-ad24-4dfc-922e-b5f22b880b35.jpg'),
Path('Testing_Images/193614a7-cca5-4108-a812-513722a6af08.jpg'),
Path('Testing_Images/193fcf6f-e8c1-4ddd-bf36-7abcc1b0cabf.jpg'),
Path('Testing_Images/1944c064-d08d-4173-9c7a-a777dcf5262c.jpg'),
Path('Testing_Images/1a062886-29be-4120-92f3-5c1c8a951959.jpg'),
Path('Testing_Images/1b2aefc5-a9dd-4f84-a314-3acec56df761.jpg'),
Path('Testing_Images/1be56c93-6607-4b08-b73a-dfe4ca81f7b1.jpg'),
Path('Testing_Images/1caa4dac-4bac-419b-91d4-cac2d8408ccd.jpg'),
Path('Testing_Images/1cee8f22-8603-49a8-a69c-ea807158db1b.jpg'),
Path('Testing_Images/1dbc0249-4761-4b19-9872-de59673942da.jpg'),
Path('Testing_Images/1e1d4b6e-7fe6-4ea7-91ea-9c42ec8d6f00.jpg'),
Path('Testing_Images/1f7a3519-99e6-4466-9859-5e628bebff73.jpg'),
Path('Testing_Images/200613d9-e461-4836-ae40-b349087e01a2.jpg'),
Path('Testing_Images/21e9c794-01b6-4193-86f5-b07f03323ad3.jpg'),
Path('Testing_Images/22a39d0b-3a66-400a-96cf-97a228a2de79.jpg'),
Path('Testing_Images/23ab8dcb-0331-42b0-8058-45676e48675e.jpg'),
Path('Testing_Images/23ec5265-20bb-4f80-9315-66ef0d7adef9.jpg'),
Path('Testing_Images/24a0f5b2-e762-4c07-ac23-b190fb0211ce.jpg'),
Path('Testing_Images/25300d70-86f1-4be6-83a9-f6ae31a0b451.jpg'),
Path('Testing_Images/2554df23-d5d3-4ffc-86c7-952a535998fc.jpg'),
Path('Testing_Images/25d3bbc4-580d-40d1-9b4f-b29bf9741b93.jpg'),
Path('Testing_Images/25e49cde-cdd9-47ba-9e3c-bc8406433240.jpg'),
Path('Testing_Images/2635357b-8865-4551-9ed8-3f90b996c982.jpg'),
Path('Testing_Images/26c15b0a-9865-414d-94b2-5349e8903f88.jpg'),
Path('Testing_Images/27205fb6-5f28-4fc5-a9d8-c6748b97f4f7.jpg'),
Path('Testing_Images/276b4fe8-3347-46e7-a48d-2fd36a6ab837.jpg'),
Path('Testing_Images/27775937-404e-4e1a-bef0-af5d63fc0b3b.jpg'),
Path('Testing_Images/28684a2f-1e6f-4431-969a-2f9cf574e7ba.jpg'),
Path('Testing_Images/296c307e-cf85-4ef4-95eb-7a05d4576dfa.jpg'),
Path('Testing_Images/2af1bc3b-fad3-4645-9171-791c1a6fc1ee.jpg'),
Path('Testing_Images/2b74be46-2b42-4baf-9c9c-cc253e766a56.jpg'),
Path('Testing_Images/2c575dd0-1a23-40d1-a0b6-da6d9c2556ae.jpg'),
Path('Testing_Images/2c68016e-5a63-4430-a834-efe5d43edd0e.jpg'),
Path('Testing_Images/2c781913-5f24-4b0e-959c-6b7385c6fce0.jpg'),
Path('Testing_Images/2cce4f3d-ad2a-4021-b46a-3f34015600e5.jpg'),
Path('Testing_Images/2d370cb1-791e-407e-9408-9156664de6b6.jpg'),
Path('Testing_Images/2dac040f-71c3-4bd3-8d4f-4320066e2d04.jpg'),
Path('Testing_Images/2db5d817-4f4c-4773-afa0-ed3cdb3a5035.jpg'),
Path('Testing_Images/2e12816f-48d5-4a6c-a8ac-c3c2ae7ba059.jpg'),
Path('Testing_Images/2e1dc497-36cf-4881-8166-5552783eb2ef.jpg'),
Path('Testing_Images/2eb0a283-bb6d-48ef-9c11-a0924f32e67b.jpg'),
Path('Testing_Images/2ec54800-bf48-43c5-aee9-18de34508ef4.jpg'),
Path('Testing_Images/2feb2bad-6b36-4067-87f1-54d39235e3c5.jpg'),
Path('Testing_Images/313a3547-7b54-4b60-af7f-2f53977681c2.jpg'),
Path('Testing_Images/3164b8ce-f84e-47db-9c14-e78890c955ed.jpg'),
Path('Testing_Images/3173b82f-569b-43e6-aecc-b4fb6a14466f.jpg'),
Path('Testing_Images/31764d54-ea3b-434f-bae2-8c579ed13799.jpg'),
Path('Testing_Images/319c7cc1-72c8-4b0d-96b2-bf6de8c26021.jpg'),
Path('Testing_Images/31a00524-b6eb-4d94-99ed-e70827499507.jpg'),
Path('Testing_Images/31f103c6-fcd1-4ba5-aae8-87b20e7b9546.jpg'),
Path('Testing_Images/31f4be2a-eb24-4816-8766-56ff59314cc9.jpg'),
Path('Testing_Images/320df0f4-4189-401a-a836-569ba4f8f389.jpg'),
Path('Testing_Images/32261a33-4fb0-4222-bd6a-29532bfbd7df.jpg'),
Path('Testing_Images/32408669-c137-4e8d-bd62-fe8345b40e73.jpg'),
Path('Testing_Images/3251dea8-4f74-4f4b-8f56-167b0213414b.jpg'),
Path('Testing_Images/329c254a-e6bc-44cf-bc09-365d5d43f4df.jpg'),
Path('Testing_Images/32c3e235-1885-4aad-b4ad-2ef8583fbb78.jpg'),
Path('Testing_Images/33382d82-31fd-4a74-950a-1820412aa6c5.jpg'),
Path('Testing_Images/335ee631-5d9a-464c-8fdd-fa759b5c9fe0.jpg'),
Path('Testing_Images/338e80a6-7b26-4133-b051-a40e063d39a3.jpg'),
Path('Testing_Images/3397d218-ad10-49dd-a3e0-9b1ee6d3d91d.jpg'),
Path('Testing_Images/33b759a7-cc42-4d7e-8f54-c1a32b389674.jpg'),
Path('Testing_Images/33df5e75-0b13-4c2b-b217-a0b32e3fc396.jpg'),
Path('Testing_Images/33edceb6-b8c3-4d4b-84af-f755b48f9cdf.jpg'),
Path('Testing_Images/33f85237-bcad-4dd9-8a03-42d061fd987f.jpg'),
Path('Testing_Images/342209eb-5fc1-46cf-83af-4beaee1a60b6.jpg'),
Path('Testing_Images/34411756-b5ea-47b6-bcb4-3dd037d95eba.jpg'),
Path('Testing_Images/345d4363-a4e6-47f4-97c8-9bbd22061ae9.jpg'),
Path('Testing_Images/345e6018-89b7-41d0-bd33-2350c1c60311.jpg'),
Path('Testing_Images/345e85a4-6f43-422c-b472-61e36c37faa2.jpg'),
Path('Testing_Images/3465b95f-b01b-4c97-97e0-e4986afa5bd8.jpg'),
Path('Testing_Images/347476f1-b3e1-4e7d-a029-e6da07d7e57c.jpg'),
Path('Testing_Images/34795b7d-4e44-4ebf-80b6-7c9317569664.jpg'),
Path('Testing_Images/34858b4b-37ff-4130-be8f-7075f3f3b056.jpg'),
Path('Testing_Images/348699f9-4eb2-42b7-b9e8-8bc3230b32a5.jpg'),
Path('Testing_Images/3488f56d-1c15-45f2-ad45-ee5b11196442.jpg'),
Path('Testing_Images/349c35ba-a32c-4cef-bfe9-613da6d7054d.jpg'),
Path('Testing_Images/34aca5a7-7786-4060-afea-cf31d8fc05d6.jpg'),
Path('Testing_Images/34d340bd-2928-41cb-8d2a-98ba84999a01.jpg'),
Path('Testing_Images/34fbff70-fa6e-4709-ab07-17f739fce394.jpg'),
Path('Testing_Images/34fdff09-5bc2-4df5-b8cf-3c37662037c8.jpg'),
Path('Testing_Images/353a3a8c-0b05-4f05-87d9-42d0b069d664.jpg'),
Path('Testing_Images/35669389-7137-41c8-9040-f6b4aaf7ec3f.jpg'),
Path('Testing_Images/3580d303-401f-4f0b-ac88-0729834349a8.jpg'),
Path('Testing_Images/3588145b-c0eb-49dc-87e5-56dda99847d3.jpg'),
Path('Testing_Images/358a25c3-ecbc-490b-b887-1594168f4dab.jpg'),
Path('Testing_Images/3594b48f-fcc1-4947-93f5-64be66467147.jpg'),
Path('Testing_Images/35e2f247-6930-4b7d-bc43-baaa622cce2c.jpg'),
Path('Testing_Images/35f287e7-12d7-41f4-a5e9-711086bd031d.jpg'),
Path('Testing_Images/35f9509f-a9d4-4d9c-8401-887063252793.jpg'),
Path('Testing_Images/36042724-581b-4459-a45a-603beec0bdf2.jpg'),
Path('Testing_Images/3606dcdc-56a2-405f-b799-2f7e13d1b2e8.jpg'),
Path('Testing_Images/363fb4cd-363a-42eb-8a44-7bc088a57f69.jpg'),
Path('Testing_Images/364b4a5e-baf6-40a8-ab77-a68b3aaec56f.jpg'),
Path('Testing_Images/36880ed5-467f-489a-bc53-bc399f314fb8.jpg'),
Path('Testing_Images/36c6e858-3acf-46df-8b0a-9b452885f85c.jpg'),
Path('Testing_Images/36ca4185-52e1-4cf3-b007-5c7680d2ae74.jpg'),
Path('Testing_Images/36deb0a8-4845-4610-88a1-5ba66039f00c.jpg'),
Path('Testing_Images/3707b351-5760-4962-93a0-9f58252d22e7.jpg'),
Path('Testing_Images/37296532-0632-40c6-ba5c-377e4d12cacd.jpg'),
Path('Testing_Images/3747cf8d-0059-4600-8015-e55b2313cefb.jpg'),
Path('Testing_Images/378ef52a-cbc7-4e92-bd6b-ecb4045508f3.jpg'),
Path('Testing_Images/37a40c9f-fa3d-4819-85c0-531d0ba6b991.jpg'),
Path('Testing_Images/37c1ff63-3e09-4419-8b98-3a981ea916d9.jpg'),
Path('Testing_Images/3811c08f-3c21-4327-bb92-b57932c66398.jpg'),
Path('Testing_Images/381e4a17-cf7a-4d8d-8937-5bb3e123056f.jpg'),
Path('Testing_Images/38395193-8ed1-4d49-9f3e-4d9a2418eaed.jpg'),
Path('Testing_Images/3862449a-cc8b-40da-91a0-a1437618e65c.jpg'),
Path('Testing_Images/38a60ce9-5adc-4875-9fb6-05ea5d44b557.jpg'),
Path('Testing_Images/3902eb7b-6c41-4eed-910f-5eb03f58c50d.jpg'),
Path('Testing_Images/394e0236-0dbc-47a4-8da7-0e4e5311f690.jpg'),
Path('Testing_Images/39552f5b-ee51-4a9a-9028-8fcdaae9baac.jpg'),
Path('Testing_Images/3a010152-9ce2-4780-bdd8-bac6fa101991.jpg'),
Path('Testing_Images/3a81e5cd-9c7b-4761-a58b-f013d0747822.jpg'),
Path('Testing_Images/3abb7176-035d-46cc-844e-820870e8154b.jpg'),
Path('Testing_Images/3ac49fd1-c580-48f1-8e3c-6a50d3447ba7.jpg'),
Path('Testing_Images/3ac4d041-0c6b-4ca8-9765-3f7b227bd303.jpg'),
Path('Testing_Images/3ae26ce0-03ab-4963-8ab5-b87b55d3982c.jpg'),
Path('Testing_Images/3af465f7-7be0-4b58-969b-943499e00946.jpg'),
Path('Testing_Images/3b0df968-c8e2-4c0d-8401-f42b104850a2.jpg'),
Path('Testing_Images/3b253e33-a932-4d96-a8fe-f3a5be6b66a6.jpg'),
Path('Testing_Images/3b31b14d-5c0e-4a47-983f-98099dd9f991.jpg'),
Path('Testing_Images/3b4decc7-fed9-46fd-9d30-179b7cde4a2c.jpg'),
Path('Testing_Images/3b9bcca1-0ca9-4e93-ba0c-0a6ae1342726.jpg'),
Path('Testing_Images/3bbcfb90-b950-4502-9866-82a131651a73.jpg'),
Path('Testing_Images/3bf038e0-a6df-45d4-9422-c9a1e1ddda49.jpg'),
Path('Testing_Images/3c072590-e9e8-4ac2-b9e7-dd1776e2a537.jpg'),
Path('Testing_Images/3c112c11-3e6a-4105-bcdf-78e146ca8862.jpg'),
Path('Testing_Images/3c625ae7-a036-4bf1-9984-257e233ed907.jpg'),
Path('Testing_Images/3c7d89e1-4fb0-4832-821e-5978aa5284df.jpg'),
Path('Testing_Images/3c87fabf-5d43-4d3e-8c5f-1e6eb2e82b73.jpg'),
Path('Testing_Images/3cb2bebe-0b95-42af-9bfe-72aeff44f745.jpg'),
Path('Testing_Images/3cdd4c46-2476-4203-9161-1bdfd56c4278.jpg'),
Path('Testing_Images/3cf1aff4-8d32-498b-a7fe-d2a6a5292d3c.jpg'),
Path('Testing_Images/3d1c3cd6-7e88-48ea-91b7-3150c2833bcf.jpg'),
Path('Testing_Images/3d30e71a-b167-472f-80d9-d5d0a814a8ad.jpg'),
Path('Testing_Images/3d643bc7-85ea-4684-88f6-4e2ee36ff46d.jpg'),
Path('Testing_Images/3db4f481-4e2e-4dd0-a191-a807ee22a39b.jpg'),
Path('Testing_Images/3dcbf0ea-0175-4f6a-94c3-5cea4380e5f8.jpg'),
Path('Testing_Images/3e07be0a-9693-4f9c-9295-9e72b3e2a872.jpg'),
Path('Testing_Images/3e28ee16-bbba-446c-a7fd-ea744954a21a.jpg'),
Path('Testing_Images/3e2db6da-3b64-4388-8098-1c4c037ec03a.jpg'),
Path('Testing_Images/3e5bfc24-2f0b-4d91-8fe8-923d27900567.jpg'),
Path('Testing_Images/3e62e51c-befb-4a66-b28e-fc469b19e4a8.jpg'),
Path('Testing_Images/3e6ff69e-0090-42c6-97b1-26434f52b19d.jpg'),
Path('Testing_Images/3e881f10-28aa-4626-a79a-50cc014b7a1a.jpg'),
Path('Testing_Images/3e9dc5b9-1b7a-4e9f-a8aa-d28fc622f595.jpg'),
Path('Testing_Images/3ea07f33-3cd8-4a27-b680-8e1bc57aa1b8.jpg'),
Path('Testing_Images/3eab8ad4-5d91-4b53-a7f7-9ccc04100665.jpg'),
Path('Testing_Images/3ec0d348-9d0a-4f8a-8dfa-45e576ef62cc.jpg'),
Path('Testing_Images/3ed75b04-2a32-44a8-9297-0e58441f0e8c.jpg'),
Path('Testing_Images/3f20c5bb-e534-42fe-b3f7-b6c3c4ec01ed.jpg'),
Path('Testing_Images/3f375c9a-9059-4886-9275-165281dd41b0.jpg'),
Path('Testing_Images/3f4eae54-fb2b-4616-a32f-c42f7df570b7.jpg'),
Path('Testing_Images/3f9d98ed-1a89-436c-a528-8d34b10a8530.jpg'),
Path('Testing_Images/3fd5687e-bc16-4802-a44e-163f667882e4.jpg'),
Path('Testing_Images/400a53d8-22e5-40d1-a846-be334bc5c363.jpg'),
Path('Testing_Images/400f9de2-ee62-4c18-8d8b-7c122b688ecf.jpg'),
Path('Testing_Images/401b69a0-3d07-47f8-bfea-4bc30aa51403.jpg'),
Path('Testing_Images/404397af-558a-4175-8beb-c594079e2e74.jpg'),
Path('Testing_Images/405599ce-75dc-422c-b058-08022f60611a.jpg'),
Path('Testing_Images/405fc22f-ea3f-481e-bb8e-b6b75ce9b916.jpg'),
Path('Testing_Images/40a41f0b-a687-43c8-bed0-1dc5a83eec71.jpg'),
Path('Testing_Images/40a62ccf-a9ec-46a2-a349-f72e4f8d4488.jpg'),
Path('Testing_Images/40c66f83-da90-4316-b53e-ded82c4b8c5a.jpg'),
Path('Testing_Images/40dbceff-201d-4889-9aa1-2064246fb6b8.jpg'),
Path('Testing_Images/40e894fe-57f0-48b9-8d7e-0898685827bd.jpg'),
Path('Testing_Images/41259843-2e8f-41fb-9d3e-b894e54e37f9.jpg'),
Path('Testing_Images/415f5493-d330-4239-b25c-cda4ef422683.jpg'),
Path('Testing_Images/417b4ee2-27b9-45ac-9a3d-0fa726b3ad92.jpg'),
Path('Testing_Images/41966d18-052b-4335-9e27-2a100ef5b4f1.jpg'),
Path('Testing_Images/41df6d1a-a499-4032-8115-216443dbdc2b.jpg'),
Path('Testing_Images/422d598c-f96e-40e0-abea-f6db2f9945ec.jpg'),
Path('Testing_Images/42900731-1044-4fca-9641-96c7dd02625c.jpg'),
Path('Testing_Images/437d5164-86fa-4a55-b130-6a4b2e1ffe5e.jpg'),
Path('Testing_Images/44476603-8ea4-4a46-83a5-a4bc90649282.jpg'),
Path('Testing_Images/448df823-eecf-4412-9f1a-5f70e84d203f.jpg'),
Path('Testing_Images/4522d490-b6c8-42f3-907d-d7182a790487.jpg'),
Path('Testing_Images/45415872-fc2a-43ba-869d-5131c57622f1.jpg'),
Path('Testing_Images/455f89b9-3245-4c41-87e1-a867ff4622fa.jpg'),
Path('Testing_Images/457491c0-de6d-4bf1-9d74-1f9ea46b563d.jpg'),
Path('Testing_Images/45877324-19f1-49c8-90f2-037d0a6ebfae.jpg'),
Path('Testing_Images/4603cccb-c783-44e6-8c38-df7210e1c85a.jpg'),
Path('Testing_Images/4637f5df-19c7-406d-829a-aafa747821ec.jpg'),
Path('Testing_Images/4648165d-da7c-4cd6-ba52-92bf16119777.jpg'),
Path('Testing_Images/464d7389-cbe4-46df-a2fe-b5c9bfca664e.jpg'),
Path('Testing_Images/467e09de-b9b2-4d84-81af-c7b88314b08c.jpg'),
Path('Testing_Images/46c06db4-948e-4495-a9a1-d4f945a0fb6e.jpg'),
Path('Testing_Images/46e3db27-f943-4322-b7df-c960c6f0e915.jpg'),
Path('Testing_Images/47b6aa0b-bd34-42a0-a0ad-469a1c6e3e1e.jpg'),
Path('Testing_Images/47ea6a1c-63ce-4785-b103-f4fe7c3f754a.jpg'),
Path('Testing_Images/47ec1b70-a18d-4c98-a313-4999a4059742.jpg'),
Path('Testing_Images/481e364a-3cd1-4188-abc1-33c5816023f6.jpg'),
Path('Testing_Images/481ff753-4106-405c-962e-a3f1c1e3bc37.jpg'),
Path('Testing_Images/482986dd-d09a-4850-94d9-3a13e8d877e7.jpg'),
Path('Testing_Images/485c7e13-e958-458a-9af1-1b4f0f0e11f8.jpg'),
Path('Testing_Images/48b71876-62c7-44c0-91b1-0952bbca6291.jpg'),
Path('Testing_Images/48defac0-fab6-4c3f-9d1a-0ba441d599ec.jpg'),
Path('Testing_Images/48e50289-c648-476d-9bcc-da7d51270702.jpg'),
Path('Testing_Images/48e79fd1-8194-47f1-9e54-d26af27cb1b0.jpg'),
Path('Testing_Images/494b5858-a56a-4016-a4ab-be74c50128f5.jpg'),
Path('Testing_Images/49566c7b-edf4-45ca-96d9-fbc3afa97477.jpg'),
Path('Testing_Images/497715df-fa64-48e6-9cc6-dcc6cfd32c39.jpg'),
Path('Testing_Images/4a04893d-e68e-46eb-956d-1c92b6ee27a4.jpg'),
Path('Testing_Images/4a39e459-50fd-410c-a418-a0d5197158c0.jpg'),
Path('Testing_Images/4a53ac3d-7277-4400-9e6b-55e83fe6d54f.jpg'),
Path('Testing_Images/4a9f80ea-ca61-4763-afe9-76b7e0dd7560.jpg'),
Path('Testing_Images/4b22e87f-d0b0-43d6-a8e6-97a27cf0ee9e.jpg'),
Path('Testing_Images/4bf42abe-22fa-4c93-919b-cff5ed49bc87.jpg'),
Path('Testing_Images/4c4124e0-1584-49a0-97e5-76c708f072fb.jpg'),
Path('Testing_Images/4c43612d-dc22-473d-8194-6efa0110b39e.jpg'),
Path('Testing_Images/4cfb1452-b5dd-4b81-a890-8af9d3788de7.jpg'),
Path('Testing_Images/4d476ccd-5814-45a6-85f2-e419c9881065.jpg'),
Path('Testing_Images/4d95fe98-3899-4bf4-8d54-c0c2ca641201.jpg'),
Path('Testing_Images/4dab56c9-09db-47fb-88f3-0989cadc5c66.jpg'),
Path('Testing_Images/4dbc44ca-694a-4814-9a24-bc60c8d8d611.jpg'),
Path('Testing_Images/4e26d7a9-e0ee-4116-a887-d1fca71d7372.jpg'),
Path('Testing_Images/4e27dfb7-0951-4b29-9851-b28d24f9b9f3.jpg'),
Path('Testing_Images/4f33e113-e2d1-4122-8651-e937da208edf.jpg'),
Path('Testing_Images/4f751af2-085d-4ba2-b794-09bbc67ec337.jpg'),
Path('Testing_Images/4f7604aa-98ef-43c5-a0f4-6b4b043bd7dd.jpg'),
Path('Testing_Images/4f93c4a5-c309-4c1e-b71a-58c2f3b38353.jpg'),
Path('Testing_Images/4f9a7303-4d5d-45fc-bcc4-d3f50f03464e.jpg'),
Path('Testing_Images/4fa69712-d57b-4ea9-bdf3-40cdc03f02ff.jpg'),
Path('Testing_Images/4fd01523-eaf3-4003-aae2-92ed7f7fac45.jpg'),
Path('Testing_Images/50dcabc3-7ef6-4965-9a85-e15aa321c482.jpg'),
Path('Testing_Images/50f59689-fcc2-400c-8d9f-2a57df5c56ea.jpg'),
Path('Testing_Images/5192122c-e2bd-46e8-8837-7dccf497dc54.jpg'),
Path('Testing_Images/51fbcba7-7bdb-49de-818c-066ec4ff155f.jpg'),
Path('Testing_Images/521c3423-5ab5-4f57-b84b-f0f709fdae3d.jpg'),
Path('Testing_Images/52debd65-65c9-4a26-bbfd-eac603f41e17.jpg'),
Path('Testing_Images/53112e45-39d1-4075-b668-e20d5228a9e0.jpg'),
Path('Testing_Images/531722e7-d083-4c1d-b639-49c537ad758e.jpg'),
Path('Testing_Images/53645011-e495-4bc3-a3fa-087aa3cde971.jpg'),
Path('Testing_Images/54539347-4e0a-4b42-af9a-420d5f0c95e7.jpg'),
Path('Testing_Images/54a73e25-c2ce-4109-87c1-2eac19cdc2fd.jpg'),
Path('Testing_Images/54adda49-e4a2-4706-af0c-40174ba821a6.jpg'),
Path('Testing_Images/54bee6da-4ad1-4351-a0f7-ed1d9461d566.jpg'),
Path('Testing_Images/55356b68-9219-4130-94a4-b3827f5cd0e1.jpg'),
Path('Testing_Images/553e5581-e9ac-44da-9a81-a95ed7320afd.jpg'),
Path('Testing_Images/55507de4-9a27-431d-a16f-3a8b9a5766a4.jpg'),
Path('Testing_Images/5557e72e-07cc-4279-8912-2c319937c4f7.jpg'),
Path('Testing_Images/557ab975-0426-4746-a88f-872ef5675134.jpg'),
Path('Testing_Images/55bf30ae-819a-4d20-816c-d45615df0688.jpg'),
Path('Testing_Images/56130832-8cb2-4c91-a98e-a40b1f1d1dd2.jpg'),
Path('Testing_Images/5673f2ae-da70-4301-b2ba-dfb6e9f84085.jpg'),
Path('Testing_Images/5675fd06-eb6c-49a2-b3a3-0df515e9a266.jpg'),
Path('Testing_Images/56789bdb-728f-4e47-95ec-e008f08e0b41.jpg'),
Path('Testing_Images/567dee1f-ac03-4f7c-b944-d65b58c575a9.jpg'),
Path('Testing_Images/56b278f2-ab29-48b5-9602-b7ec496885df.jpg'),
Path('Testing_Images/56b5c0cf-8a3c-42f2-b3e2-adb2f550c451.jpg'),
Path('Testing_Images/56cf72bd-877d-4efc-9464-c2991b225820.jpg'),
Path('Testing_Images/57906d1b-d8dc-473e-8be1-c0e11e217054.jpg'),
Path('Testing_Images/57b7c911-a726-457c-b588-0a1ffdf0ad5f.jpg'),
Path('Testing_Images/57e70848-0cce-456d-90b1-e3f0a206d8c1.jpg'),
Path('Testing_Images/58033c30-f751-4d64-91d0-c8074851b72e.jpg'),
Path('Testing_Images/586a5f15-fbd3-4e13-ab3b-6df803e6a031.jpg'),
Path('Testing_Images/595251bf-1704-4be6-b8fb-a69123163f17.jpg'),
Path('Testing_Images/5973a2ae-a2f9-404e-930d-0d7c96f756ef.jpg'),
Path('Testing_Images/5976961d-69f3-456c-a61d-66537789c550.jpg'),
Path('Testing_Images/59ffc415-e14b-47b2-86f1-07e41fee2846.jpg'),
Path('Testing_Images/5a072ffe-ed90-4516-803f-30641dfa9497.jpg'),
Path('Testing_Images/5a133587-a3b4-4515-be3e-fd0ee5a39f7a.jpg'),
Path('Testing_Images/5a17c6cf-5a46-4fb6-906c-4ac89f532fa5.jpg'),
Path('Testing_Images/5a1edc6c-b8a9-4b7f-870a-4e5f867cca33.jpg'),
Path('Testing_Images/5a2efed9-8132-4de5-bf6f-c73c7d014530.jpg'),
Path('Testing_Images/5a678523-c339-4b21-9cab-02c304308023.jpg'),
Path('Testing_Images/5ae077c7-0d89-49f4-b32d-626adaceada6.jpg'),
Path('Testing_Images/5ae70f1f-2b80-409e-8347-993f58185f3b.jpg'),
Path('Testing_Images/5b1172cd-40bd-4ac1-a802-82bda5579efa.jpg'),
Path('Testing_Images/5b19a5a1-d627-495e-80dd-c367f34a448c.jpg'),
Path('Testing_Images/5b35df82-9440-4634-9a28-406e1819cd4e.jpg'),
Path('Testing_Images/5b9d346b-8f6e-4ebf-8865-0c1ff9baac49.jpg'),
Path('Testing_Images/5bda9372-b868-4488-8e0f-6422bbd9dcf0.jpg'),
Path('Testing_Images/5d5ca429-74d2-44f9-997f-e28c804e4f81.jpg'),
Path('Testing_Images/5ddf194a-9266-4248-90a2-ddf0573aa569.jpg'),
Path('Testing_Images/5df1e543-92ca-42f2-94a8-7446f944a1ba.jpg'),
Path('Testing_Images/5e01e86a-d042-4644-a101-359fc9c8f59e.jpg'),
Path('Testing_Images/5e25afee-3357-47e0-930c-9415906d57d9.jpg'),
Path('Testing_Images/5e30641a-d90c-47f8-a798-76155850846b.jpg'),
Path('Testing_Images/5e9ad61b-0f2f-4942-b09d-b27844af8d0e.jpg'),
Path('Testing_Images/5eb932e2-3455-40fe-93db-ae44f897d9e0.jpg'),
Path('Testing_Images/5ed1cd5c-727a-4952-8ac0-52992df725f2.jpg'),
Path('Testing_Images/5f04038d-8c7a-455a-a2ac-8582102d7227.jpg'),
Path('Testing_Images/5f17bdef-37c1-447c-a41e-43ac2bf464df.jpg'),
Path('Testing_Images/5f1d47f7-b5e6-45fd-9393-fbfafb495c4f.jpg'),
Path('Testing_Images/604e0958-831f-40cd-87ff-08fe60c9890d.jpg'),
Path('Testing_Images/605614b9-394a-43b5-badc-e520df208c75.jpg'),
Path('Testing_Images/6088923d-3eaf-4988-8d8c-6a6012c4b376.jpg'),
Path('Testing_Images/612892af-2ac4-4da7-befb-c9bd18be97a2.jpg'),
Path('Testing_Images/6160ea7b-2653-4a89-be6a-02cfbcc651f3.jpg'),
Path('Testing_Images/61b32ca0-a2f1-40db-a18f-0bbfaed05a09.jpg'),
Path('Testing_Images/61cdcb62-7c4c-47dd-b366-44d4cf22e586.jpg'),
Path('Testing_Images/6226b4c2-5a8d-45a2-8a7a-f9b0160c625f.jpg'),
Path('Testing_Images/6231f6a8-7933-4f92-8981-a301dca04a57.jpg'),
Path('Testing_Images/625f6a52-1eab-46e5-a8de-711dcc21a51a.jpg'),
Path('Testing_Images/627567ae-907d-4d83-907d-79db88f48303.jpg'),
Path('Testing_Images/631f9025-e04f-47c4-8db1-de7ddf0c8b60.jpg'),
Path('Testing_Images/633192a2-d31a-42ed-993d-0d84c25d6eb9.jpg'),
Path('Testing_Images/6366d2b6-c1a7-40da-998d-981aa684a7f6.jpg'),
Path('Testing_Images/637b1dfe-c876-44d8-b5a9-e5ebe3663773.jpg'),
Path('Testing_Images/639b858a-eaf2-4605-9a90-25a07876421e.jpg'),
Path('Testing_Images/6438d645-6fdb-4fda-8598-a4c1902a33c2.jpg'),
Path('Testing_Images/643b5c88-3f21-4d40-90cb-bc0caa562e18.jpg'),
Path('Testing_Images/644662c0-c18a-4f1e-bccf-f0b222d99f59.jpg'),
Path('Testing_Images/6477a41c-821b-4c21-88cf-677f73d8bcd4.jpg'),
Path('Testing_Images/647b7bbc-5cfb-4776-b157-838a4264d318.jpg'),
Path('Testing_Images/648d7073-547d-4ce2-b91d-87436e5e7abf.jpg'),
Path('Testing_Images/64d4a708-eb80-4609-bcf8-38201b602e6a.jpg'),
Path('Testing_Images/64dd81eb-6c14-431b-ad5f-520297c62f97.jpg'),
Path('Testing_Images/6535af98-4b8c-46bd-8ed2-3afb6c811466.jpg'),
Path('Testing_Images/6542c163-7ccb-406f-8278-b27ca52ef90f.jpg'),
Path('Testing_Images/65f038e8-a727-472c-bb76-7a71cd4397f2.jpg'),
Path('Testing_Images/65f2119b-0e72-4f6e-9ec6-194f56b5b07d.jpg'),
Path('Testing_Images/666d06c9-eff3-4936-8673-f058961d392f.jpg'),
Path('Testing_Images/669f22f9-a27a-4aef-ba95-492fbe0dcb8f.jpg'),
Path('Testing_Images/66c52ba5-c91d-4f0c-af96-18148fd9dd66.jpg'),
Path('Testing_Images/66f69589-bf3b-4083-b226-79c68e030099.jpg'),
Path('Testing_Images/670080d5-d370-4c5a-b617-f5409d52a88d.jpg'),
Path('Testing_Images/670fe5a1-c998-46bb-83c0-e1925c6b3c71.jpg'),
Path('Testing_Images/6720f41b-840e-4c71-bbd2-838858ab2281.jpg'),
Path('Testing_Images/6735932c-c8ca-45d5-be56-1ff682a3182e.jpg'),
Path('Testing_Images/673a442a-9862-4b7b-83ef-9205c4c9dc1a.jpg'),
Path('Testing_Images/67666c79-cae5-48e8-be25-95c4960db15e.jpg'),
Path('Testing_Images/6766f51e-1b41-4320-b0dc-0c0cd97b6077.jpg'),
Path('Testing_Images/67e32074-a6a2-47eb-8ef0-f2d73ae775dd.jpg'),
Path('Testing_Images/680d309d-3aa5-4883-ae61-1b211286f5a4.jpg'),
Path('Testing_Images/68276cbf-0a10-439b-905a-6cefb1e2fdf3.jpg'),
Path('Testing_Images/6876141d-e5f2-4523-a0ae-c6d0a8f30169.jpg'),
Path('Testing_Images/68fce9ea-3122-49bd-8968-85e8f9315c83.jpg'),
Path('Testing_Images/69e20212-750e-482f-9ad0-1aed2911fa50.jpg'),
Path('Testing_Images/69ffd6ea-f6f2-408e-a314-eecbd3a3756b.jpg'),
Path('Testing_Images/6a552a6e-0b8e-4825-88ab-72c1da5ed42b.jpg'),
Path('Testing_Images/6b1c85b0-7acb-4d9c-b4c0-622d0025a088.jpg'),
Path('Testing_Images/6b2e1586-e6d6-4df6-89ce-06d8f4698cac.jpg'),
Path('Testing_Images/6b58d363-12fd-4194-81e2-d1637d678793.jpg'),
Path('Testing_Images/6b5ec6c5-06c3-4574-b44a-d5735bdb363b.jpg'),
Path('Testing_Images/6b969591-8a07-49ed-9f65-fa19a453df59.jpg'),
Path('Testing_Images/6bb76d7d-19b0-4f43-b5e8-7b050118a72a.jpg'),
Path('Testing_Images/6bbdbd28-5cd5-4f79-9128-b40e9e5d1fd5.jpg'),
Path('Testing_Images/6bf5b78a-635b-48f2-b767-460711927d7f.jpg'),
Path('Testing_Images/6c0341ef-950c-45c1-9009-777d893e8060.jpg'),
Path('Testing_Images/6c64fe3d-2ee7-4c33-80f1-7860773428e5.jpg'),
Path('Testing_Images/6c6dadf2-89ea-472c-9020-2c52f7f4cec4.jpg'),
Path('Testing_Images/6cda5173-edbe-4020-a64e-2a3b75b13283.jpg'),
Path('Testing_Images/6ce856cb-8c76-4ab6-ae18-d1afea549f94.jpg'),
Path('Testing_Images/6d9912ad-e56b-4ff3-931e-75b39df49c0f.jpg'),
Path('Testing_Images/6e028cf7-8ea4-4e7d-a2b5-0d5bd2eca495.jpg'),
Path('Testing_Images/6e66dd4c-d183-42c7-ad77-32abf6ca9760.jpg'),
Path('Testing_Images/6e802ad4-38a6-4517-9cac-bb14f3ae10e7.jpg'),
Path('Testing_Images/6e9329fc-862a-47dc-848f-fb560ed17096.jpg'),
Path('Testing_Images/6ed32587-0518-4b4b-95e6-a4b0651b1a6d.jpg'),
Path('Testing_Images/6eef0ab9-47c0-4b01-bd5d-7caa17593e87.jpg'),
Path('Testing_Images/6f2b836b-d150-45cd-ad9d-2ba6a6d4a508.jpg'),
Path('Testing_Images/6f5eb191-d7ee-472e-b08c-e6975896f5c4.jpg'),
Path('Testing_Images/6f84e6cf-b9ee-43d2-ab56-eae54adbcb88.jpg'),
Path('Testing_Images/704e5443-f7ed-4eae-9826-09a410d26c22.jpg'),
Path('Testing_Images/70559288-6edd-4228-85b6-c53776a754ed.jpg'),
Path('Testing_Images/705c3feb-6faa-442a-a321-f8b14d1d2a2a.jpg'),
Path('Testing_Images/7092a623-6d2d-4ecb-abed-be46296c4d7e.jpg'),
Path('Testing_Images/70a9e4e6-99a6-4f66-92e2-f1c30c714301.jpg'),
Path('Testing_Images/70c3efab-83bc-48e9-8b5e-6bb55ff5981b.jpg'),
Path('Testing_Images/70cb2306-fffb-4c0d-8385-3da27552f6f7.jpg'),
Path('Testing_Images/70cc15b4-37b2-4d2b-90ee-bdc3153cc0d5.jpg'),
Path('Testing_Images/70fd3f0c-f79f-4cbd-893f-c1d6b80b7af7.jpg'),
Path('Testing_Images/7106dad7-902b-4707-994e-f13826548929.jpg'),
Path('Testing_Images/71131596-6768-4832-a4b4-5b7a1478d060.jpg'),
Path('Testing_Images/7119d8ae-329e-4cf4-bfac-389ae146ec96.jpg'),
Path('Testing_Images/71231f52-c755-4f52-956f-eab77b9434f5.jpg'),
Path('Testing_Images/7123c0f8-b37f-4172-ae51-a32486ee3fae.jpg'),
Path('Testing_Images/714f9b26-0dee-401a-9f15-ae432961c820.jpg'),
Path('Testing_Images/71641993-53d7-4039-afd0-918e10567d2c.jpg'),
Path('Testing_Images/71665ab7-b4e9-4734-af2c-216a66706d5e.jpg'),
Path('Testing_Images/718d7420-83bf-48a2-9419-042046583ffc.jpg'),
Path('Testing_Images/71963cf6-6089-4a04-93c0-cdea6d3d33d4.jpg'),
Path('Testing_Images/7196b782-8132-48fa-bdcc-e67b393aa9a8.jpg'),
Path('Testing_Images/71c5e5fa-6d92-44a3-a9fa-1e3421b27a07.jpg'),
Path('Testing_Images/723515df-a2b9-4f89-bb3f-437e364bc726.jpg'),
Path('Testing_Images/72831387-4c1f-4398-a638-803ad559e2fd.jpg'),
Path('Testing_Images/729f2aa0-9564-4228-b516-1d8d4be8bb55.jpg'),
Path('Testing_Images/72a2d10b-cc88-433c-b988-8013f7530d99.jpg'),
Path('Testing_Images/72f79a49-6208-4253-9094-8081308caffb.jpg'),
Path('Testing_Images/7307c83f-63e1-46e1-9400-8d1eac241510.jpg'),
Path('Testing_Images/730b6928-fd01-4955-b4be-bca1e1e9306f.jpg'),
Path('Testing_Images/73176c86-28cb-425b-9744-27b76446aabf.jpg'),
Path('Testing_Images/734baa5e-73f6-4a71-bb4f-e1cfd988e04c.jpg'),
Path('Testing_Images/73827f0b-0bc9-47eb-8249-14621ad77f5a.jpg'),
Path('Testing_Images/738524e2-9e98-43cb-8c56-b480a2a0ae2f.jpg'),
Path('Testing_Images/73c72ba5-c349-4906-9f36-ac9e6f266125.jpg'),
Path('Testing_Images/74e774b3-ca7c-47c7-b580-393f6a432206.jpg'),
Path('Testing_Images/74e8bd5d-687f-4c95-b2a5-e1568ab8a7f1.jpg'),
Path('Testing_Images/74eb29cc-33a7-4ad4-b420-a4f4ba34cf19.jpg'),
Path('Testing_Images/74f69a44-9224-4c14-a6a4-cffef6d1a757.jpg'),
Path('Testing_Images/75508098-4cca-4037-90e5-0ae2ba6b4db9.jpg'),
Path('Testing_Images/756fc4f4-37ac-4cc6-b6d8-c10d30bf27f1.jpg'),
Path('Testing_Images/757980d7-0e1d-4568-832f-50ac017d1e9a.jpg'),
Path('Testing_Images/7580b25b-17e7-4627-93e0-057060fa01e3.jpg'),
Path('Testing_Images/7594b38b-54a7-4ee0-b5f6-694e12bcd830.jpg'),
Path('Testing_Images/75cf69a5-bdc1-4ef9-8937-2abad5bec289.jpg'),
Path('Testing_Images/75e23cd3-39cd-4afd-b15d-73dc9d78a337.jpg'),
Path('Testing_Images/75fa3ec9-b117-49cb-9d48-799821b3c471.jpg'),
Path('Testing_Images/7600fee2-7c16-43ff-b5e4-3b0c383156b4.jpg'),
Path('Testing_Images/769f98fc-59c7-4f3a-af65-80a75b679923.jpg'),
Path('Testing_Images/76cc4621-8f68-4394-98b4-1677b6314cbf.jpg'),
Path('Testing_Images/7738f4a5-a2fa-44eb-b96b-7d6e4a317327.jpg'),
Path('Testing_Images/776835b8-9137-4b15-abd4-3aebda8793d5.jpg'),
Path('Testing_Images/77a4a76f-d348-4dd4-b22a-d7b676121d78.jpg'),
Path('Testing_Images/788d8f3f-9060-49a3-8f25-24c08447c3c3.jpg'),
Path('Testing_Images/7892e538-8c5e-4521-a550-4a88f399bf63.jpg'),
Path('Testing_Images/789e8a7a-2ded-466f-8d50-17e8bc955978.jpg'),
Path('Testing_Images/78c9b88e-a134-4470-b161-22e4a698206c.jpg'),
Path('Testing_Images/78ce6b9a-db15-4576-8e8b-4c52650a457f.jpg'),
Path('Testing_Images/78d267de-aefb-4e2b-9489-e6c4fcd3a852.jpg'),
Path('Testing_Images/78df2b9e-8339-45fc-bbeb-d53467f8d4e2.jpg'),
Path('Testing_Images/7934f518-dbaa-4201-922b-1bc93a245e84.jpg'),
Path('Testing_Images/7944ddb2-ca88-4e4f-8311-65aa85ee58bd.jpg'),
Path('Testing_Images/796ceab6-d8ba-42ac-b75f-3863063a878d.jpg'),
Path('Testing_Images/7971ccf5-18a3-47d3-a2ea-eea88033404e.jpg'),
Path('Testing_Images/79795f56-4d8f-45a1-9eea-8e9f87dea646.jpg'),
Path('Testing_Images/79858395-3452-426d-bb99-e01af7961b44.jpg'),
Path('Testing_Images/7a10efd9-178c-485c-b1e3-605298a5bc76.jpg'),
Path('Testing_Images/7a4e441c-4497-47af-ade0-396d45782e03.jpg'),
Path('Testing_Images/7a8dd22c-7af7-428c-a043-578390fc6fa1.jpg'),
Path('Testing_Images/7a9147e6-2271-4a2b-b996-dd99237b71e7.jpg'),
Path('Testing_Images/7ae62f3e-f20c-4991-8d4d-cb0862bda103.jpg'),
Path('Testing_Images/7b67b1d4-9518-42a6-9d8a-05babb919b46.jpg'),
Path('Testing_Images/7b73919e-d14d-4f3c-88cd-e81a21ba756f.jpg'),
Path('Testing_Images/7b94c8f7-7f99-4499-bb56-9930bf999e7c.jpg'),
Path('Testing_Images/7bc6f7d5-aa35-4c1e-86d8-c4df27c483b8.jpg'),
Path('Testing_Images/7be096ac-d43e-47e6-8325-4d404ed8106b.jpg'),
Path('Testing_Images/7d3178fd-8811-4b12-8093-1ef9d8b32683.jpg'),
Path('Testing_Images/7dfe6d14-a31c-483a-be83-7a705e45719f.jpg'),
Path('Testing_Images/7e186a19-babe-4152-b882-e3eae737b9d7.jpg'),
Path('Testing_Images/7e403391-3766-4617-90c3-ce7dddcc826a.jpg'),
Path('Testing_Images/7e81879b-66ca-470a-903d-edce8bc7a218.jpg'),
Path('Testing_Images/7e85a356-c710-409a-95ea-a92e3dccb7f8.jpg'),
Path('Testing_Images/7f05b70f-2ad0-4db6-9e05-6c2baec8cd52.jpg'),
Path('Testing_Images/7f544a3f-e003-4f07-a3ce-6416930f0d0c.jpg'),
Path('Testing_Images/7f81fcb5-397d-4a3e-acaf-6837b5fc19d6.jpg'),
Path('Testing_Images/7f99c12d-5a9b-4d3c-b314-9f76ca17383c.jpg'),
Path('Testing_Images/7fc478ab-7eca-454d-885c-6aec22dc6903.jpg'),
Path('Testing_Images/7fe911ee-72a4-47a5-b98f-3662e5ac778e.jpg'),
Path('Testing_Images/7ff721f6-5055-4d4b-bd48-e6d49e4d3e7d.jpg'),
Path('Testing_Images/8005e8b9-ab58-45e2-b902-8057864630af.jpg'),
Path('Testing_Images/8160b6e6-9cbf-4083-b849-991b1eaed067.jpg'),
Path('Testing_Images/81961979-c8d9-4b8a-b185-30753d2926e4.jpg'),
Path('Testing_Images/81c5f05c-0a87-4db7-b2fa-6b930ac99b49.jpg'),
Path('Testing_Images/81d9d146-e9bb-478b-aeed-c2431780030e.jpg'),
Path('Testing_Images/81f6fcf9-88eb-47f2-b217-714feaf5289f.jpg'),
Path('Testing_Images/82099a84-dd20-48dd-a87a-7ffd3d948c8b.jpg'),
Path('Testing_Images/823a9297-1c70-4f5d-8f10-b33205761ca2.jpg'),
Path('Testing_Images/82ad571f-aa41-4f2c-b07f-de31c4d1a53c.jpg'),
Path('Testing_Images/82b3fca7-e397-487d-80bd-3b2fbd046f16.jpg'),
Path('Testing_Images/830b3d83-ff0b-4821-95a6-224eb7604bcc.jpg'),
Path('Testing_Images/833a38fd-dab6-4dd9-ba53-0f536b86555b.jpg'),
Path('Testing_Images/83d25ba1-ba1c-4d32-b505-5cf0cffeff4c.jpg'),
Path('Testing_Images/83e5b531-4609-4a91-81e3-ef67ba4c3d83.jpg'),
Path('Testing_Images/83eb118a-7a4d-4624-afb0-48f6358fd586.jpg'),
Path('Testing_Images/84309ddd-319f-4e6d-b312-2bc70537b074.jpg'),
Path('Testing_Images/8448316b-3072-4d30-a9d1-cf0b32473fb0.jpg'),
Path('Testing_Images/84c3c816-eb8b-4a44-ae5f-e589011ae9fc.jpg'),
Path('Testing_Images/851e6307-c481-4851-8040-e64d97ee6cd1.jpg'),
Path('Testing_Images/85727853-cb45-478d-a517-a0f0c85d0eb5.jpg'),
Path('Testing_Images/8589cf54-e34c-44a5-8860-0cc8ac7b6eb3.jpg'),
Path('Testing_Images/85c3ed41-f0d8-40e1-b439-0607ff697619.jpg'),
Path('Testing_Images/86375503-9b51-4796-8806-d4573cd9a532.jpg'),
Path('Testing_Images/86401948-d976-41b5-965a-120a8cc3899a.jpg'),
Path('Testing_Images/8672529d-7b85-4c2d-b9b0-6ecb13954d2d.jpg'),
Path('Testing_Images/86746808-b97b-4a47-b073-973fa4babd3f.jpg'),
Path('Testing_Images/86c27916-8e16-46f3-bfc3-41ad645dd672.jpg'),
Path('Testing_Images/86f92091-0f8b-400e-9a13-f3d61ad724a5.jpg'),
Path('Testing_Images/87233f70-c512-4a6b-8ece-a4cb41accb66.jpg'),
Path('Testing_Images/873f4d9b-de57-4129-a83c-0bfc6cdfa313.jpg'),
Path('Testing_Images/875b34e4-b58d-479f-954a-5b015071f448.jpg'),
Path('Testing_Images/87661ab3-35e0-47b1-b399-81d96fc762bc.jpg'),
Path('Testing_Images/8782c8e0-8592-47e0-95df-ee8224d6dec8.jpg'),
Path('Testing_Images/87c29c2a-db5a-46ba-8112-fe4a09a6e58e.jpg'),
Path('Testing_Images/87e02682-fc89-44c0-8a7c-866f0dcaa8eb.jpg'),
Path('Testing_Images/881ed5f0-cf38-462a-8f31-b037a5fc0177.jpg'),
Path('Testing_Images/88527165-a8f1-4859-9b94-a1300aa65659.jpg'),
Path('Testing_Images/8882a89a-09fa-4724-9e1b-283023523b01.jpg'),
Path('Testing_Images/889fbb03-ad00-4da1-abea-6ad0a8843335.jpg'),
Path('Testing_Images/88a033e5-b48c-4f0c-aa0a-3c0ae353cc0d.jpg'),
Path('Testing_Images/88dfac4d-30d4-42fe-9122-82246ecee401.jpg'),
Path('Testing_Images/89064bf9-0a79-4ddb-b572-775200d94f15.jpg'),
Path('Testing_Images/89168da5-8bd2-4c94-8011-5c9305bb85f0.jpg'),
Path('Testing_Images/89492f31-d3a7-4e3f-b715-efa858eb1c8c.jpg'),
Path('Testing_Images/895c1071-db4e-4163-a754-1f3e0d32d439.jpg'),
Path('Testing_Images/89692e2b-c54d-4104-acba-8ff010111524.jpg'),
Path('Testing_Images/89875600-76f0-426e-96bc-0d9c84a4a2d0.jpg'),
Path('Testing_Images/8996833b-bd93-4721-bf63-1da145e7f851.jpg'),
Path('Testing_Images/89d63d9c-91be-49b1-a072-a4076a35a29b.jpg'),
Path('Testing_Images/89e23e03-487d-49dd-ae1e-b42efe3ae119.jpg'),
Path('Testing_Images/89ff3f4d-2631-42d0-8d82-8017c4dcce7c.jpg'),
Path('Testing_Images/8a546535-613f-49b1-897c-a0933c53cb84.jpg'),
Path('Testing_Images/8b0cf1ed-4b24-4bcf-9474-d7d2716382db.jpg'),
Path('Testing_Images/8b60f253-6ae0-4b16-89bc-f3e3163dbbae.jpg'),
Path('Testing_Images/8be3494d-fe57-47d2-9dd1-ac5db3781850.jpg'),
Path('Testing_Images/8be844aa-e84b-4496-b67e-707e6074ca30.jpg'),
Path('Testing_Images/8d2681aa-f9f9-4ca6-92e9-27fbb546d0e5.jpg'),
Path('Testing_Images/8d530b59-727f-4f00-89b1-26cdfb116a04.jpg'),
Path('Testing_Images/8d65cac2-1aa0-4158-a81f-df8367abbd46.jpg'),
Path('Testing_Images/8d7a4925-c838-449c-bb08-8e7699322366.jpg'),
Path('Testing_Images/8d837773-a674-4af2-b401-2b3216586c5e.jpg'),
Path('Testing_Images/8dd9cf58-1874-4592-b5aa-f5fd962337fd.jpg'),
Path('Testing_Images/8ed37733-e182-4508-9785-5a546fd41557.jpg'),
Path('Testing_Images/8efbae3f-42e7-45f4-8cc8-be6a572bbe73.jpg'),
Path('Testing_Images/8f0500c9-dcdd-4c0f-8ba5-452c0af5fc02.jpg'),
Path('Testing_Images/8f0e79ea-5db6-4a83-9886-c7d58f5f4825.jpg'),
Path('Testing_Images/8f719c99-d7f6-4e27-a5f6-3a259dc60b61.jpg'),
Path('Testing_Images/8f8e1dc5-bbc9-40ef-a735-c1f1cf71138a.jpg'),
Path('Testing_Images/8fa2efa3-f8c8-4b89-9c22-e6468ec696d8.jpg'),
Path('Testing_Images/90440659-a140-451d-9ddb-2908d4408c93.jpg'),
Path('Testing_Images/9067a3b6-b8bf-49a9-bc76-675853dd37b5.jpg'),
Path('Testing_Images/90710ba1-7628-4b4c-ac71-a61944ba3825.jpg'),
Path('Testing_Images/90a4a31f-ffc0-440f-bdc9-975ef54bba40.jpg'),
Path('Testing_Images/90b2b94e-efa8-4ff7-98e1-86ea3e33b575.jpg'),
Path('Testing_Images/90c345aa-be44-4390-9f4f-12d02643560d.jpg'),
Path('Testing_Images/90ca0844-924a-4f74-926e-7561a808c0f4.jpg'),
Path('Testing_Images/90de382f-ab3f-4d2a-a4c8-ef4085287976.jpg'),
Path('Testing_Images/9168c2e9-87c1-434a-bbf5-bd8ee1d0e124.jpg'),
Path('Testing_Images/916a0cd0-d682-43e0-aeb2-8da59c626599.jpg'),
Path('Testing_Images/916c0fd1-2ec0-42cb-a0f0-378dc882aae2.jpg'),
Path('Testing_Images/91b2ce38-efaf-4476-b77c-4f363d340f11.jpg'),
Path('Testing_Images/91d66175-55d7-4a72-ac9f-e07604dde06a.jpg'),
Path('Testing_Images/91d9f69e-f6da-4f70-b539-7a3133714594.jpg'),
Path('Testing_Images/925a8700-508c-42a2-a617-1855f29f8877.jpg'),
Path('Testing_Images/92965444-fa89-41e8-b0bf-c132789b8295.jpg'),
Path('Testing_Images/92d8d10a-d3b3-41a3-be93-ad21dea0d5bf.jpg'),
Path('Testing_Images/92dec024-edba-43be-973d-b2d58886e7c3.jpg'),
Path('Testing_Images/92e0dd85-743f-4b58-b079-9b4909f71d43.jpg'),
Path('Testing_Images/9327bd1e-f734-4c74-b797-b9276d4f431c.jpg'),
Path('Testing_Images/932d09e6-2597-4a6d-a7fd-f6c07de79bb0.jpg'),
Path('Testing_Images/9341907e-9b62-4197-a0b3-3ddd3c55a884.jpg'),
Path('Testing_Images/93727b85-2fa2-478d-aecc-46e14f391dd5.jpg'),
Path('Testing_Images/93c1ea7f-20b2-4860-9beb-fd0ad2f54bd6.jpg'),
Path('Testing_Images/93e10690-53ab-4acc-99f6-64bd37250e86.jpg'),
Path('Testing_Images/93eb36cf-5356-4be3-b740-1c4d7a842741.jpg'),
Path('Testing_Images/94211273-5b21-47fa-bbe0-7496e66672ac.jpg'),
Path('Testing_Images/9471cd26-848c-4d14-98cf-64c9bda9929c.jpg'),
Path('Testing_Images/94808997-3c67-4d45-bf6e-423f3705d30d.jpg'),
Path('Testing_Images/9560b299-0367-42d7-b627-32a7212015f0.jpg'),
Path('Testing_Images/957f244a-9680-40f3-8b04-367fcad6e6dc.jpg'),
Path('Testing_Images/958f4388-559f-482c-9b7f-33eb9d64ea61.jpg'),
Path('Testing_Images/95eac19c-8aa4-4955-a3da-8bcbb6f6b639.jpg'),
Path('Testing_Images/95ef1bd3-c207-4d70-88fd-c37c275086bb.jpg'),
Path('Testing_Images/960b5087-3abf-4a92-819d-c72dceae1d2b.jpg'),
Path('Testing_Images/964dc796-8083-4c35-b97a-e59e9b938964.jpg'),
Path('Testing_Images/9673d5f3-fc1c-49bc-902a-81ff14db1da0.jpg'),
Path('Testing_Images/9760925f-858b-4470-aaeb-5908bf03672a.jpg'),
Path('Testing_Images/97fe9a9a-3b11-4cbe-b053-2d5de0739dff.jpg'),
Path('Testing_Images/98833b28-3925-4098-bf44-c81a9e687838.jpg'),
Path('Testing_Images/988a4bbd-3a14-4b4b-b846-6729e35fcd91.jpg'),
Path('Testing_Images/9911bf13-bce6-4b33-b866-1944aca8fab0.jpg'),
Path('Testing_Images/99259c38-32f9-42fd-bb3e-8a3b59521784.jpg'),
Path('Testing_Images/999c7ad4-22ab-440a-8298-9ddffaaecfc3.jpg'),
Path('Testing_Images/99f99868-4c89-47f0-adbb-5a9ca675002d.jpg'),
Path('Testing_Images/9a05ffb1-0988-4f71-89a3-2eac51deb62e.jpg'),
Path('Testing_Images/9a543e59-766e-4da2-82aa-a891682b2793.jpg'),
Path('Testing_Images/9a609592-f47b-4c77-ad7b-be4f8833aa1d.jpg'),
Path('Testing_Images/9a714fc0-da5f-4587-aa33-38045523e322.jpg'),
Path('Testing_Images/9b315c4f-e7dc-4fc8-88b4-abcea9f8ab46.jpg'),
Path('Testing_Images/9b3f0904-75c5-448b-aab9-c02a191da2cd.jpg'),
Path('Testing_Images/9b6f8882-b59f-4a4d-8f48-b3f355fd70b9.jpg'),
Path('Testing_Images/9b8a86aa-9a71-4b5a-a081-0589c72f3a24.jpg'),
Path('Testing_Images/9bb53dc0-28ed-47ed-bff3-9e6cc3c03d62.jpg'),
Path('Testing_Images/9bf09456-da28-4206-9e8b-fc076b5fc0f3.jpg'),
Path('Testing_Images/9c2875c7-fa39-4af6-a82d-b545dc59874d.jpg'),
Path('Testing_Images/9cde0ff6-095f-4ac6-88a8-c9ffc4676230.jpg'),
Path('Testing_Images/9cf09192-094a-46cd-9be8-c7be106d90d9.jpg'),
Path('Testing_Images/9d0cbf60-07ab-4621-950a-9ee60910d2aa.jpg'),
Path('Testing_Images/9d801278-8699-4eee-ac13-14de9cc773d1.jpg'),
Path('Testing_Images/9d93caf3-5af3-48a0-9a6e-23bec560da26.jpg'),
Path('Testing_Images/9dc1e167-1784-45cd-9fad-9bd7c8c7d904.jpg'),
Path('Testing_Images/9dcfa483-4ed7-4cc8-954e-eb7c188e914d.jpg'),
Path('Testing_Images/9dd6e2cd-3afe-4a03-a3d3-de161a0446a7.jpg'),
Path('Testing_Images/9e53a361-9507-4b07-9b3b-56bad312fb6a.jpg'),
Path('Testing_Images/9f2e86d5-cf74-4c4f-84f9-c9e842c8ef0d.jpg'),
Path('Testing_Images/9f70e16b-6372-47ca-8589-1862539f0655.jpg'),
Path('Testing_Images/9facb2f2-d1bd-4b4f-bc13-b800841ab0d0.jpg'),
Path('Testing_Images/a0688bb5-65a3-4b02-b0a5-cd0dc775608d.jpg'),
Path('Testing_Images/a0999072-21a9-4898-b391-5990998eca20.jpg'),
Path('Testing_Images/a09a64a7-dcdd-4325-a76a-2093989fb1f8.jpg'),
Path('Testing_Images/a0ca289b-6e0b-433d-a45b-8f86044e1ae1.jpg'),
Path('Testing_Images/a14e460a-c153-4813-896e-34f7bac2b49b.jpg'),
Path('Testing_Images/a1538feb-5647-463b-9876-ca6e5ff00f48.jpg'),
Path('Testing_Images/a1870588-5aa1-4a1e-92cc-62383776659f.jpg'),
Path('Testing_Images/a1aff3c3-4eae-40ed-800d-d5aa693e1132.jpg'),
Path('Testing_Images/a1c8a30b-9978-4be6-9be1-d55456ef2480.jpg'),
Path('Testing_Images/a1d1d277-41a0-43ec-bee0-6d4bde45cca1.jpg'),
Path('Testing_Images/a20667cb-b7b3-454c-8fa8-2acdd4bd0604.jpg'),
Path('Testing_Images/a26bdab3-55db-47a6-8641-1371bd6d2568.jpg'),
Path('Testing_Images/a2708a20-2a3c-43e4-9135-c395e7ba4fc5.jpg'),
Path('Testing_Images/a2cac936-8602-408f-9447-41691428bd0f.jpg'),
Path('Testing_Images/a2eea84a-9fd7-4c1a-8d0f-95e23c0d659b.jpg'),
Path('Testing_Images/a304611a-7d97-428b-a09e-26c9d11d4f51.jpg'),
Path('Testing_Images/a30808af-7822-4f47-beaf-0c1a840cd980.jpg'),
Path('Testing_Images/a36e7035-fe1e-4145-85cc-e893094beb3a.jpg'),
Path('Testing_Images/a392cdea-a4ca-4eb5-9922-ca15f9db0cee.jpg'),
Path('Testing_Images/a3c04f4d-3621-4679-872b-fad6dd79a4f8.jpg'),
Path('Testing_Images/a3cf5488-9d3f-4bd9-9e6f-9891e1f992fb.jpg'),
Path('Testing_Images/a3d2d6c4-0ae9-4bd6-b8d5-04ffaa1ee870.jpg'),
Path('Testing_Images/a3d30007-2d62-4c16-afbe-8ad3766978b9.jpg'),
Path('Testing_Images/a3dbcd82-9801-4981-83e3-303b294dc5f6.jpg'),
Path('Testing_Images/a4160584-0311-47f5-877b-9ab95fc2d553.jpg'),
Path('Testing_Images/a472a01d-2145-4d5f-bee6-588b785bf036.jpg'),
Path('Testing_Images/a4a28a4f-9496-430c-8cc1-cff2b44ac359.jpg'),
Path('Testing_Images/a4a5cd1a-bd42-42e2-89f9-b1617da60a68.jpg'),
Path('Testing_Images/a4c3d978-f2f0-4e3a-9da1-e9176a153b4e.jpg'),
Path('Testing_Images/a500fcb1-de25-4425-879c-124e7eff6ae5.jpg'),
Path('Testing_Images/a5282327-a8ac-41c1-b4fa-f6a5db60d992.jpg'),
Path('Testing_Images/a53df4cc-6693-42a7-af15-fdbe00adcaa2.jpg'),
Path('Testing_Images/a5a9dd5d-a04a-4085-a84f-3fb30f1ffba3.jpg'),
Path('Testing_Images/a5e1f9e8-784c-415c-bded-f83e6130a111.jpg'),
...]
for image_path in TEST_IMAGE_PATHS[:4]:
print(image_path)
show_inference(detection_model, image_path)
Testing_Images\00d7c36e-3cdf-4df6-ac03-6c30cdc8e85b.jpg output_dict[detection_classes] [ 1 1 1 65 1]
Testing_Images\0100515c-5204-4f31-98e0-f35e4b00004a.jpg output_dict[detection_classes] [1]
Testing_Images\016fd98f-09a8-4b99-a8df-c3c26992f22e.jpg output_dict[detection_classes] [ 1 65 1 28]
Testing_Images\02743848-e50a-4faf-a3f4-a6215613a23d.jpg output_dict[detection_classes] [ 1 28]
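The raw `detection_classes` output above is just integer IDs. A minimal sketch of decoding them with a label map — the `category_index` below is a hypothetical COCO-style excerpt for illustration, not the model's actual label file:

```python
# Hypothetical excerpt of a COCO-style label map; the real mapping comes from
# the label file shipped with the detection model.
category_index = {1: 'person', 28: 'umbrella', 65: 'bed'}

def decode_classes(class_ids, index):
    """Map raw integer class IDs to readable labels, keeping unknown IDs visible."""
    return [index.get(int(c), f'id_{int(c)}') for c in class_ids]

print(decode_classes([1, 1, 1, 65, 1], category_index))
# -> ['person', 'person', 'person', 'bed', 'person']
```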
# Pickling the best classification model:
# DenseNet121 with CheXNet weights gave the best classification accuracy of all models at >80%
with open('new_dense_model_2.pkl', 'wb') as f:
    pkl.dump(new_dense_model, f)
# Pickling the best object detection model:
# MobileNet and DenseNet gave roughly the same IoU, but MobileNet is much faster in execution and prediction
with open('mobilenet_objdet', 'wb') as f:
    pkl.dump(base_model1, f)
******************** Pickling of models done ********************
test_submission = pd.read_csv("stage_2_sample_submission.csv")
test_submission.sample(5).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | patientId | PredictionString |
|---|---|---|
| 1222 | 1bb7ca9c-c1ea-4054-93ee-44009e6cc119 | 0.5 0 0 100 100 |
| 289 | 04176263-de7f-471b-9dc1-7a5af4e6d943 | 0.5 0 0 100 100 |
| 1860 | 243847a8-928b-4fb9-a7e5-20ec95e0445e | 0.5 0 0 100 100 |
| 1388 | 1e3497cb-00c5-4333-9d35-a27fc9a49829 | 0.5 0 0 100 100 |
| 2865 | c00a132f-6108-46e7-a67a-77efd3d2d0ef | 0.5 0 0 100 100 |
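Each `PredictionString` encodes a candidate box as `confidence x y width height` (multiple boxes are concatenated with spaces). A small sketch of building one such string, using the placeholder box from the sample submission:

```python
def to_prediction_string(conf, x, y, width, height):
    # RSNA pneumonia challenge submission format: "confidence x y width height"
    return f"{conf} {int(x)} {int(y)} {int(width)} {int(height)}"

print(to_prediction_string(0.5, 0, 0, 100, 100))  # -> 0.5 0 0 100 100
```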
# Loading the test DICOM images
items_test = get_dicom_files('stage_2_test_images/')
items_test
items_test[2]
len(items_test)
sample_test1 = 1
xray_sample_1 = items_test[sample_test1].dcmread()
xray_sample_1
dicom_test = pd.DataFrame.from_dicoms(items_test)
dicom_test
dicom_test.dtypes
SpecificCharacterSet           object
SOPClassUID                    object
SOPInstanceUID                 object
StudyDate                      object
StudyTime                      object
AccessionNumber                object
Modality                       object
ConversionType                 object
ReferringPhysicianName         object
SeriesDescription              object
PatientName                    object
PatientID                      object
PatientBirthDate               object
PatientSex                     object
PatientAge                     object
BodyPartExamined               object
ViewPosition                   object
StudyInstanceUID               object
SeriesInstanceUID              object
StudyID                        object
SeriesNumber                    int64
InstanceNumber                  int64
PatientOrientation             object
SamplesPerPixel                 int64
PhotometricInterpretation      object
Rows                            int64
Columns                         int64
MultiPixelSpacing               int64
PixelSpacing                  float64
PixelSpacing1                 float64
BitsAllocated                   int64
BitsStored                      int64
HighBit                         int64
PixelRepresentation             int64
LossyImageCompression          object
LossyImageCompressionMethod    object
fname                          object
img_min                          int8
img_max                          int8
img_mean                      float64
img_std                       float64
img_pct_window                float64
dtype: object
dicom_test.to_csv("dicom_test.csv", index = False)
def test_submission_csv(pass_model, data):
    test_submission_pred = pd.DataFrame()
    for pat_id_1 in data:
        fname = pat_id_1 + ".dcm"
        image_path = "stage_2_test_images/" + fname
        ds = dicom.dcmread(image_path)
        img = ds.pixel_array
        img_for_prediction = img[tf.newaxis, ...]
        #img_pred_128 = np.resize(img,(img_for_prediction.shape[0],128,128,3))
        img_pred_region = pass_model.predict(img_for_prediction)  # predict the BBox
        x0 = img_pred_region[0, 0]
        y0 = img_pred_region[0, 1]
        x1 = img_pred_region[0, 2]
        y1 = img_pred_region[0, 3]
        row = pd.Series([pat_id_1, x0, y0, x1, y1])
        test_submission_pred = test_submission_pred.append(row, ignore_index=True)
    test_submission_pred.columns = ["patient_id", "X", "Y", "Width", "Height"]
    return test_submission_pred
dicom_test.columns
Index(['SpecificCharacterSet', 'SOPClassUID', 'SOPInstanceUID', 'StudyDate',
'StudyTime', 'AccessionNumber', 'Modality', 'ConversionType',
'ReferringPhysicianName', 'SeriesDescription', 'PatientName',
'PatientID', 'PatientBirthDate', 'PatientSex', 'PatientAge',
'BodyPartExamined', 'ViewPosition', 'StudyInstanceUID',
'SeriesInstanceUID', 'StudyID', 'SeriesNumber', 'InstanceNumber',
'PatientOrientation', 'SamplesPerPixel', 'PhotometricInterpretation',
'Rows', 'Columns', 'MultiPixelSpacing', 'PixelSpacing', 'PixelSpacing1',
'BitsAllocated', 'BitsStored', 'HighBit', 'PixelRepresentation',
'LossyImageCompression', 'LossyImageCompressionMethod', 'fname',
'img_min', 'img_max', 'img_mean', 'img_std', 'img_pct_window'],
dtype='object')
dicom_test_final = pd.DataFrame(data = dicom_test['PatientID'],columns = ["PatientID"])
dicom_test_final["Image_file"] = dicom_test_final["PatientID"] + ".dcm"
dicom_test_final.sample(5)
| | PatientID | Image_file |
|---|---|---|
| 375 | 0550c5d5-941c-4995-9276-6401a8a4ffbe | 0550c5d5-941c-4995-9276-6401a8a4ffbe.dcm |
| 1535 | 1fe97b7f-62e1-4b0c-84b4-d41486039f01 | 1fe97b7f-62e1-4b0c-84b4-d41486039f01.dcm |
| 1962 | 2598af1b-4bfa-470c-95d5-3ba472f033b6 | 2598af1b-4bfa-470c-95d5-3ba472f033b6.dcm |
| 1170 | 1b06bd36-85c7-4495-b0a8-a9aacc29be17 | 1b06bd36-85c7-4495-b0a8-a9aacc29be17.dcm |
| 2740 | 2ff6f390-1e4c-489e-91ac-6b18448a08db | 2ff6f390-1e4c-489e-91ac-6b18448a08db.dcm |
dicom_test_final["Full_filename"] = "stage_2_test_images"+"/"+dicom_test_final["Image_file"]
dicom_test_final.sample(5).style.set_properties(**{'border': '1.3px solid black','color': 'blue'}).hide_index()
| PatientID | Image_file | Full_filename |
|---|---|---|
| 25e3c135-1274-407c-b34d-643460fbf617 | 25e3c135-1274-407c-b34d-643460fbf617.dcm | stage_2_test_images/25e3c135-1274-407c-b34d-643460fbf617.dcm |
| 140e2cbb-3fc3-4a5d-830d-1709d1745efe | 140e2cbb-3fc3-4a5d-830d-1709d1745efe.dcm | stage_2_test_images/140e2cbb-3fc3-4a5d-830d-1709d1745efe.dcm |
| 19b8517b-24e4-457b-96c0-a457e30125bb | 19b8517b-24e4-457b-96c0-a457e30125bb.dcm | stage_2_test_images/19b8517b-24e4-457b-96c0-a457e30125bb.dcm |
| 0d000250-5968-40f7-bcab-8b6cffd8ccb3 | 0d000250-5968-40f7-bcab-8b6cffd8ccb3.dcm | stage_2_test_images/0d000250-5968-40f7-bcab-8b6cffd8ccb3.dcm |
| 004b28ac-d188-427b-8a21-10501125f596 | 004b28ac-d188-427b-8a21-10501125f596.dcm | stage_2_test_images/004b28ac-d188-427b-8a21-10501125f596.dcm |
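String concatenation works for building `Full_filename`, but `pathlib` expresses the same idea more robustly; a sketch with a made-up patient ID:

```python
from pathlib import Path

def dicom_path(patient_id, root="stage_2_test_images"):
    # patient_id here is a made-up example; the real IDs come from
    # dicom_test_final["PatientID"]
    return Path(root) / f"{patient_id}.dcm"

print(dicom_path("abc123").as_posix())  # -> stage_2_test_images/abc123.dcm
```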
# Display randomly sampled test images in a grid
def show_images(passed_value):
    show_df = dicom_test_final.sample(n=passed_value)
    z = math.ceil(passed_value / 5)  # one row of subplots per 5 images
    fig, ax = plt.subplots(z, 5, squeeze=False, constrained_layout=True)
    fig.set_figheight(20)
    fig.set_figwidth(20)
    i = 0
    j = 0
    for index, row in show_df.iterrows():
        image_path = row['Full_filename']
        ds = dicom.dcmread(image_path)
        img = ds.pixel_array
        dim = (224, 224)
        img_resize = cv2.resize(img, dim, interpolation=cv2.INTER_LINEAR)
        ax[i][j].imshow(img_resize, cmap=plt.cm.bone)
        j += 1
        if j == 5:
            i += 1
            j = 0
    plt.show()
    return show_df
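The `i`/`j` bookkeeping in `show_images` is a flat-index-to-grid conversion; `divmod` captures it in one line (a standalone sketch, not part of the notebook's code):

```python
def grid_position(k, ncols=5):
    # k-th image -> (row, col) in a grid with ncols columns
    return divmod(k, ncols)

print([grid_position(k) for k in range(7)])
# -> [(0, 0), (0, 1), (0, 2), (0, 3), (0, 4), (1, 0), (1, 1)]
```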
show_images(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'})
| | PatientID | Image_file | Full_filename |
|---|---|---|---|
| 244 | 037d1b26-3dc7-48c2-b93c-214e7538c76d | 037d1b26-3dc7-48c2-b93c-214e7538c76d.dcm | stage_2_test_images/037d1b26-3dc7-48c2-b93c-214e7538c76d.dcm |
| 2540 | 2d682b94-2c9d-4bd3-8b59-bbf22616beab | 2d682b94-2c9d-4bd3-8b59-bbf22616beab.dcm | stage_2_test_images/2d682b94-2c9d-4bd3-8b59-bbf22616beab.dcm |
| 2144 | 282c7cf6-713a-45d3-921f-0a15c9e5ec1f | 282c7cf6-713a-45d3-921f-0a15c9e5ec1f.dcm | stage_2_test_images/282c7cf6-713a-45d3-921f-0a15c9e5ec1f.dcm |
| 1561 | 20214557-cbc9-45ec-a162-15b8ee795702 | 20214557-cbc9-45ec-a162-15b8ee795702.dcm | stage_2_test_images/20214557-cbc9-45ec-a162-15b8ee795702.dcm |
| 327 | 04785916-0ae2-40b9-bd71-1698025326fa | 04785916-0ae2-40b9-bd71-1698025326fa.dcm | stage_2_test_images/04785916-0ae2-40b9-bd71-1698025326fa.dcm |
| 1143 | 1ad0b486-8d69-4573-9303-f0f7452676b3 | 1ad0b486-8d69-4573-9303-f0f7452676b3.dcm | stage_2_test_images/1ad0b486-8d69-4573-9303-f0f7452676b3.dcm |
| 1874 | 245beb7f-d20b-49eb-8403-09360ec9f163 | 245beb7f-d20b-49eb-8403-09360ec9f163.dcm | stage_2_test_images/245beb7f-d20b-49eb-8403-09360ec9f163.dcm |
| 301 | 043d020e-c61b-40ac-8704-be5ed7ded293 | 043d020e-c61b-40ac-8704-be5ed7ded293.dcm | stage_2_test_images/043d020e-c61b-40ac-8704-be5ed7ded293.dcm |
| 2380 | 2b76e6cb-fe51-40da-bfc6-54253dee41f6 | 2b76e6cb-fe51-40da-bfc6-54253dee41f6.dcm | stage_2_test_images/2b76e6cb-fe51-40da-bfc6-54253dee41f6.dcm |
| 129 | 018cb7c3-0ec3-484d-a6a3-bd7c376605d1 | 018cb7c3-0ec3-484d-a6a3-bd7c376605d1.dcm | stage_2_test_images/018cb7c3-0ec3-484d-a6a3-bd7c376605d1.dcm |
| 566 | 0f418437-6440-4145-a7f7-b254a91f5240 | 0f418437-6440-4145-a7f7-b254a91f5240.dcm | stage_2_test_images/0f418437-6440-4145-a7f7-b254a91f5240.dcm |
| 1714 | 22669933-5680-4544-a22d-08f7875d2d3f | 22669933-5680-4544-a22d-08f7875d2d3f.dcm | stage_2_test_images/22669933-5680-4544-a22d-08f7875d2d3f.dcm |
| 1269 | 1c766749-0a34-431a-929b-06a3f0c6752d | 1c766749-0a34-431a-929b-06a3f0c6752d.dcm | stage_2_test_images/1c766749-0a34-431a-929b-06a3f0c6752d.dcm |
| 252 | 0385be2c-896e-489d-8b09-bac5694ce5ed | 0385be2c-896e-489d-8b09-bac5694ce5ed.dcm | stage_2_test_images/0385be2c-896e-489d-8b09-bac5694ce5ed.dcm |
| 406 | 05b0b6e5-e66c-4bcf-8516-4f7c052651d3 | 05b0b6e5-e66c-4bcf-8516-4f7c052651d3.dcm | stage_2_test_images/05b0b6e5-e66c-4bcf-8516-4f7c052651d3.dcm |
| 2894 | c0662eac-8a07-4ba1-9c8a-e4d15c3ac067 | c0662eac-8a07-4ba1-9c8a-e4d15c3ac067.dcm | stage_2_test_images/c0662eac-8a07-4ba1-9c8a-e4d15c3ac067.dcm |
| 1688 | 21ea97d1-c6f8-460a-8487-8530b4fcef02 | 21ea97d1-c6f8-460a-8487-8530b4fcef02.dcm | stage_2_test_images/21ea97d1-c6f8-460a-8487-8530b4fcef02.dcm |
| 1971 | 25b8d29b-958e-45a5-8b30-1f56404aea94 | 25b8d29b-958e-45a5-8b30-1f56404aea94.dcm | stage_2_test_images/25b8d29b-958e-45a5-8b30-1f56404aea94.dcm |
| 906 | 1369f719-69c9-40d8-9ce5-a54a3773cdfa | 1369f719-69c9-40d8-9ce5-a54a3773cdfa.dcm | stage_2_test_images/1369f719-69c9-40d8-9ce5-a54a3773cdfa.dcm |
| 2302 | 2a731e8d-4b28-46ec-b22d-01cbcbf67d12 | 2a731e8d-4b28-46ec-b22d-01cbcbf67d12.dcm | stage_2_test_images/2a731e8d-4b28-46ec-b22d-01cbcbf67d12.dcm |
img_rows = 224
img_cols = 224
dim = (img_rows, img_cols)
X_target = []
for img in tqdm(dicom_test_final["Full_filename"].values):
    ds_3 = dicom.dcmread(img)
    img_3 = ds_3.pixel_array
    rgb = apply_color_lut(img_3, palette='PET')  # map the grayscale image to a 3-channel pseudo-colour image
    try:
        train_img_resize = cv2.resize(rgb, dim, interpolation=cv2.INTER_LINEAR)
    except cv2.error:
        print("breaking out for", img)
        break
    X_target.append(train_img_resize)
100%|██████████████████████████████████████████████████████████████████████████████| 3000/3000 [01:12<00:00, 41.58it/s]
with open("X_test_submission.pkl", 'wb') as fileObject:
    pkl.dump(X_target, fileObject)
dicom_test_final.to_pickle("dicom_test_final.pkl")  # pickling the DICOM test metadata
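Writing pickles through a context manager guarantees the file is flushed and closed even if an error occurs mid-dump; a minimal round-trip sketch with a toy object (the file name is arbitrary):

```python
import os
import pickle as pkl
import tempfile

data = {"patient": "abc123", "box": [0, 0, 100, 100]}
path = os.path.join(tempfile.mkdtemp(), "demo.pkl")

with open(path, "wb") as f:   # closed automatically, even on error
    pkl.dump(data, f)
with open(path, "rb") as f:
    restored = pkl.load(f)

print(restored == data)  # -> True
```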
X_target_test = np.array(X_target)
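One subtlety worth noting: `cv2.resize`, used in the loading loop above, interpolates pixels, whereas NumPy's `np.resize` flattens the array and repeats or truncates its data, so it does not preserve spatial structure. A small demonstration:

```python
import numpy as np

img = np.arange(16).reshape(4, 4)

# np.resize just takes the first 4 values of the flattened array
print(np.resize(img, (2, 2)))   # -> [[0 1]
                                #     [2 3]]

# a structure-preserving downscale, e.g. by striding every other pixel
print(img[::2, ::2])            # -> [[ 0  2]
                                #     [ 8 10]]
```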
def test_submission_csv(pass_model, data):
    test_submission_pred = pd.DataFrame()
    for pat_id_1 in data:
        fname = pat_id_1 + ".dcm"
        image_path = "stage_2_test_images/" + fname
        ds = dicom.dcmread(image_path)
        img = ds.pixel_array
        img_for_prediction = img[tf.newaxis, ...]
        img_pred_128 = np.resize(img, (img_for_prediction.shape[0], 128, 128, 3))
        img_pred_region = pass_model.predict(img_pred_128)  # predict the BBox
        x0 = img_pred_region[0, 0]
        y0 = img_pred_region[0, 1]
        x1 = img_pred_region[0, 2]
        y1 = img_pred_region[0, 3]
        # scale the 128-px predictions back to the original 1024-px DICOM frame
        row = pd.Series([pat_id_1, x0 * 8, y0 * 8, x1 * 8, y1 * 8])
        test_submission_pred = test_submission_pred.append(row, ignore_index=True)
    return test_submission_pred
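The `* 8` factor comes from predicting on a 128x128 input while the DICOM frames are 1024x1024 (1024/128 = 8). The same scaling as a tiny helper (a sketch; the names are illustrative):

```python
def scale_box(x0, y0, x1, y1, scale=1024 // 128):
    # map box coordinates from the 128-px model input back to the 1024-px frame
    return (x0 * scale, y0 * scale, x1 * scale, y1 * scale)

print(scale_box(10, 12, 40, 50))  # -> (80, 96, 320, 400)
```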
Final_Test_Submission_CSV = test_submission_csv(base_model1,dicom_test_final['PatientID'])
1/1 [==============================] - 0s 34ms/step
... (one such Keras predict progress line per test image)
[==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 44ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 41ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 46ms/step 1/1 [==============================] - 0s 52ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 28ms/step 1/1 
[==============================] - 0s 28ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 34ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 26ms/step 1/1 
[==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 23ms/step 1/1 
[==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 
[==============================] - 0s 38ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 50ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 23ms/step 1/1 
[==============================] - 0s 22ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 46ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 
[==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 
[==============================] - 0s 31ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 
[==============================] - 0s 22ms/step 1/1 [==============================] - 0s 34ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 22ms/step 1/1 
[==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 22ms/step 1/1 
[==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 34ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 
[==============================] - 0s 37ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 26ms/step 1/1 
[==============================] - 0s 23ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 45ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 22ms/step 1/1 [==============================] - 0s 21ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 34ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 
[Keras per-batch prediction progress output omitted]
[==============================] - 0s 42ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 54ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 52ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 34ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 27ms/step 1/1 
[==============================] - 0s 37ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 34ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 28ms/step 1/1 
[==============================] - 0s 25ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 41ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 25ms/step 1/1 
[==============================] - 0s 25ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 34ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 28ms/step 1/1 
[==============================] - 0s 27ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 42ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 47ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 
[==============================] - 0s 35ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 26ms/step 1/1 
[==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 25ms/step 1/1 
[==============================] - 0s 25ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 42ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 34ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 41ms/step 1/1 
[==============================] - 0s 33ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 45ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 49ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 42ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 26ms/step 1/1 
[==============================] - 0s 27ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 23ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 49ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 49ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 28ms/step 1/1 
[==============================] - 0s 33ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 25ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 35ms/step 1/1 
[==============================] - 0s 27ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 29ms/step 1/1 
[==============================] - 0s 26ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 24ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 33ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 42ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 31ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 28ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 32ms/step 1/1 [==============================] - 0s 30ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 26ms/step 1/1 [==============================] - 0s 29ms/step 1/1 [==============================] - 0s 27ms/step 1/1 [==============================] - 0s 27ms/step 1/1 
[Keras progress output trimmed: several hundred repeated single-batch prediction steps ("1/1 [==============================] - 0s ~25–55ms/step").]
[==============================] - 0s 37ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 51ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 46ms/step 1/1 [==============================] - 0s 49ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 44ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 42ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 52ms/step 1/1 [==============================] - 0s 34ms/step 1/1 [==============================] - 0s 41ms/step 1/1 [==============================] - 0s 36ms/step 1/1 
[==============================] - 0s 41ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 49ms/step 1/1 [==============================] - 0s 34ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 47ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 44ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 56ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 42ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 41ms/step 1/1 [==============================] - 0s 41ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 41ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 41ms/step 1/1 [==============================] - 0s 44ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 42ms/step 1/1 [==============================] - 0s 47ms/step 1/1 
[==============================] - 0s 38ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 48ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 47ms/step 1/1 [==============================] - 0s 52ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 45ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 45ms/step 1/1 [==============================] - 0s 45ms/step 1/1 
[==============================] - 0s 38ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 46ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 44ms/step 1/1 [==============================] - 0s 34ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 51ms/step 1/1 [==============================] - 0s 46ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 41ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 54ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 48ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 41ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 45ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 40ms/step 1/1 
[==============================] - 0s 38ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 53ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 42ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 41ms/step 1/1 [==============================] - 0s 61ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 35ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 55ms/step 1/1 [==============================] - 0s 45ms/step 1/1 [==============================] - 0s 45ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 44ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 42ms/step 1/1 [==============================] - 0s 47ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 38ms/step 1/1 
[==============================] - 0s 41ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 47ms/step 1/1 [==============================] - 0s 45ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 53ms/step 1/1 [==============================] - 0s 42ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 45ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 55ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 51ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 44ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 37ms/step 1/1 
[==============================] - 0s 36ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 41ms/step 1/1 [==============================] - 0s 36ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 45ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 65ms/step 1/1 [==============================] - 0s 46ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 55ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 47ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 46ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 42ms/step 1/1 [==============================] - 0s 44ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 51ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 37ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 38ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 39ms/step 1/1 [==============================] - 0s 43ms/step 1/1 [==============================] - 0s 57ms/step 1/1 [==============================] - 0s 40ms/step 1/1 [==============================] - 0s 66ms/step 1/1 [==============================] - 0s 35ms/step
# Label the submission columns and write the final predictions to CSV (no index column)
Final_Test_Submission_CSV.columns = ["Patient_ID", "X", "Y", "Width", "Height"]
Final_Test_Submission_CSV.to_csv("Test_Submission_Final.csv", index=False)

# Preview 20 random rows with simple cell styling
Final_Test_Submission_CSV.sample(20).style.set_properties(**{'border': '1.3px solid black', 'color': 'blue'}).hide_index()
| Patient_ID | X | Y | Width | Height |
|---|---|---|---|---|
| 22dc3067-7c66-4ae2-adb1-cf4754359d14 | 267.171570 | 216.469208 | 225.522995 | 249.607925 |
| 263db843-751f-4cd1-b392-548fe40d2636 | 388.249847 | 466.375885 | 235.598846 | 326.777222 |
| c0471814-5d65-4868-94cb-e9ff37439619 | 283.811340 | 299.579376 | 185.577118 | 285.455994 |
| 1a5eef1c-c817-4c73-9e8b-d75eae0e4c0a | 294.320740 | 272.951447 | 221.707230 | 280.564148 |
| 23c0da4c-d481-45d5-8cf8-ac77f20e424c | 270.345032 | 302.309235 | 266.595032 | 241.008194 |
| 1a37de22-589c-4773-b97d-8334a2d674c4 | 294.136383 | 226.577957 | 210.520218 | 227.747040 |
| 04c2264b-9e77-464d-92e6-0c6cec3a2393 | 340.082642 | 294.464142 | 210.297058 | 286.055115 |
| 273bdee2-1dd4-4dd2-a29b-92a29023f6d3 | 189.841782 | 263.357239 | 208.102966 | 182.888489 |
| 2996eb94-ba9c-4d89-a9f4-531a15fec52f | 342.560089 | 225.835434 | 230.418259 | 285.384766 |
| 1f0056d6-a405-42d6-a088-123ed42bf1c6 | 253.523209 | 268.426208 | 226.568756 | 277.237335 |
| 265c655e-b97d-49b0-8b5c-83be37c0b80e | 303.792236 | 202.345703 | 230.229202 | 288.577026 |
| 116e1a13-ac55-499d-bcc9-ae9e0bd450d2 | 310.394897 | 290.330750 | 235.287308 | 341.448486 |
| 1371ec53-f3c3-4c63-8982-cb5a118c6448 | 280.623596 | 212.484619 | 180.887985 | 231.996277 |
| 1ac04f0b-2f0d-4179-bcc9-04bcd6ed8f76 | 279.109497 | 245.872421 | 199.125854 | 270.651947 |
| 2d5d0fc8-314d-41bd-adfd-52f1a1e3cc5c | 277.377167 | 234.236710 | 195.488251 | 247.216599 |
| c1e88810-9e4e-4f39-9306-8d314bfc1ff1 | 260.317596 | 305.770264 | 218.823318 | 244.611328 |
| 237724a1-b2fb-48e0-822b-5059d04f912e | 286.936920 | 276.594208 | 191.498108 | 265.020874 |
| 02eaa8c2-e818-4b39-92d3-aa092553b811 | 253.855942 | 257.963928 | 241.736465 | 244.059280 |
| 1ffb8eee-5a79-48a8-9f14-eada75c26c19 | 335.395569 | 517.963135 | 350.956085 | 447.869354 |
| 0e38f5ef-e94f-46d2-a67f-d3ca284e4351 | 263.479004 | 271.392517 | 233.654388 | 259.333710 |
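Before submitting, it can be worth sanity-checking that every predicted box actually fits inside the image frame. The sketch below is a hypothetical helper (not part of the original pipeline), assuming the 1024 x 1024 pixel grid used by the stage-2 chest X-ray DICOMs; the small in-line DataFrame stands in for `Final_Test_Submission_CSV`.

```python
import pandas as pd

# Assumed frame size for the stage-2 DICOM images (1024 x 1024 pixels)
FRAME = 1024

def boxes_in_frame(df):
    """Return True when every (X, Y, Width, Height) box lies inside the frame."""
    inside = ((df["X"] >= 0) & (df["Y"] >= 0) &
              (df["X"] + df["Width"] <= FRAME) &
              (df["Y"] + df["Height"] <= FRAME))
    return bool(inside.all())

# Small in-line sample standing in for Final_Test_Submission_CSV
sample = pd.DataFrame({
    "Patient_ID": ["22dc3067", "263db843"],
    "X": [267.17, 388.25], "Y": [216.47, 466.38],
    "Width": [225.52, 235.60], "Height": [249.61, 326.78],
})
print(boxes_in_frame(sample))  # True
```

Running the same check against the full submission frame would flag any box that the post-processing pushed past the image border.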
dicom_test_final.head(1)
| PatientID | Image_file | Full_filename | |
|---|---|---|---|
| 0 | 0000a175-0e68-4ca4-b1af-167204a7e0bc | 0000a175-0e68-4ca4-b1af-167204a7e0bc.dcm | stage_2_test_images/0000a175-0e68-4ca4-b1af-167204a7e0bc.dcm |
def predicted_bound_box(passed_sample):
    # Take a random sample of patient IDs and draw the predicted bounding box on each image
    sample_data = random.sample(list(Final_Test_Submission_CSV['Patient_ID']), passed_sample)
    z = math.ceil(passed_sample / 5)  # number of subplot rows, 5 images per row
    rows = []                          # collect one record per bounding box
    # squeeze=False keeps ax two-dimensional even when there is a single row
    fig, ax = plt.subplots(z, 5, constrained_layout=True, squeeze=False)
    fig.set_figheight(20)
    fig.set_figwidth(20)
    i = 0
    j = 0
    for pat_id in sample_data:
        no_ind = dicom_test_final[dicom_test_final['PatientID'] == pat_id].index.values
        image_path = "stage_2_test_images/" + pat_id + ".dcm"
        ds = dicom.dcmread(image_path)
        img = ds.pixel_array
        ax[i][j].imshow(img, cmap=plt.cm.bone)
        for k in no_ind:
            X = Final_Test_Submission_CSV.iloc[k, 1]
            Y = Final_Test_Submission_CSV.iloc[k, 2]
            width = Final_Test_Submission_CSV.iloc[k, 3]
            height = Final_Test_Submission_CSV.iloc[k, 4]
            rows.append([pat_id, len(no_ind), X, Y, width, height])
            ax[i][j].add_patch(patches.Rectangle((X, Y), width, height,
                                                 linewidth=3, edgecolor='y', facecolor='none'))
        j = j + 1
        if j == 5:
            i = i + 1
            j = 0
    plt.show()
    bounding_box_data = pd.DataFrame(rows, columns=['Image Name', 'No. Of Bounding Boxes',
                                                    'X', 'Y', 'Width', 'Height'])
    return bounding_box_data
predicted_bound_box(20).style.set_properties(**{'border': '1.3px solid black','color': 'blue'}).hide_index()
| Image Name | No. Of Bounding Boxes | X | Y | Width | Height |
|---|---|---|---|---|---|
| c028bb45-6dbc-4152-9bf5-3050e393c198 | 1 | 250.343033 | 230.100723 | 205.726562 | 289.238678 |
| 2cf1426e-9663-4c08-b37e-960e53f1225f | 1 | 277.015747 | 268.668854 | 256.627655 | 222.374466 |
| 015a202d-5e18-4dcf-802c-0f7e75e14a38 | 1 | 212.086563 | 254.331543 | 201.897614 | 196.143906 |
| c0a2978c-82f7-42e7-975d-c4eba156dfdd | 1 | 244.088974 | 228.080078 | 220.683701 | 243.352509 |
| 24aeb8f6-24bd-488b-bef0-2e8fa59418fe | 1 | 294.961884 | 306.318146 | 215.391434 | 267.564697 |
| 1ad5f416-0e7e-4f08-9e8d-abd19bbe25f5 | 1 | 318.210510 | 337.685699 | 265.675201 | 291.836395 |
| 1a583cd3-23d3-4789-a24f-bc7ae6fde815 | 1 | 273.241028 | 352.438049 | 269.918640 | 149.716599 |
| 0322a3ad-3e24-4e6d-8f24-b68e3c61d88f | 1 | 260.395050 | 284.164093 | 225.033081 | 233.988403 |
| 1afcf486-18db-4ffb-93e4-3a3d6a943110 | 1 | 279.387115 | 234.771301 | 251.331192 | 244.148941 |
| 2179339d-e6f0-49ba-a405-bfb4845fdf5a | 1 | 296.401459 | 237.253464 | 222.082657 | 237.527374 |
| 0ec7d9ce-ff6e-4160-8c20-d6622096114b | 1 | 229.267075 | 213.034485 | 221.902130 | 225.833008 |
| 13584fbe-6565-4aea-ba4e-3336996dc946 | 1 | 287.229462 | 330.500336 | 255.520538 | 262.809357 |
| 30388b30-3486-4269-b050-2f48d973e358 | 1 | 265.886810 | 208.528412 | 184.792389 | 265.630066 |
| 23870b78-01de-4d6b-be10-e3c605bea4d8 | 1 | 240.718781 | 257.663116 | 256.595734 | 207.305710 |
| 032f8386-fd1d-4ffd-b12a-263a4ce113ae | 1 | 205.167450 | 278.706024 | 233.753616 | 283.980072 |
| 0efedf3c-8bb9-44ed-85da-fe85f53cbe6a | 1 | 265.484161 | 273.406250 | 193.457748 | 232.397491 |
| 2c0009ad-af3a-47e9-afc8-ccf64caa6597 | 1 | 231.535538 | 211.308929 | 213.584030 | 221.010056 |
| 2073e3a8-945c-4e76-b911-1d79a25cd683 | 1 | 242.846390 | 280.766907 | 220.044830 | 301.732910 |
| 2860838c-8c66-4201-8418-ba961d011f75 | 1 | 324.090515 | 337.848328 | 193.627182 | 246.127792 |
| 040cfecb-0984-4393-a977-924c75795809 | 1 | 348.306824 | 384.154083 | 225.214951 | 348.796326 |
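If these per-box predictions were to be packed into a Kaggle-style submission, each patient's boxes are usually joined into a single space-delimited "confidence x y width height" string. The helper below is a hypothetical sketch of that packing step; the fixed 0.5 confidence is a placeholder standing in for the model's actual scores.

```python
# Hypothetical helper: join (x, y, width, height) boxes for one patient into a
# single space-delimited prediction string. The confidence value is a
# placeholder; a real pipeline would pass in per-box model scores.
def to_prediction_string(boxes, confidence=0.5):
    parts = []
    for x, y, w, h in boxes:
        parts.append(f"{confidence} {x:.1f} {y:.1f} {w:.1f} {h:.1f}")
    return " ".join(parts)

print(to_prediction_string([(267.2, 216.5, 225.5, 249.6)]))
# -> "0.5 267.2 216.5 225.5 249.6"
```

Patients with no detected opacity would simply get an empty string for this column.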
* The Final Report is submitted as a separate PDF file covering all the steps performed across Milestone 1 & Milestone 2.